In his blog, William J. Turkel writes about using ‘bots’ or ‘spiders’ to do the heavy-duty work of web searching for you:
Once you start collecting large numbers of digital sources by searching for them or using an information trapping strategy, you will find that you are often in the position of wanting to download a lot of files from a given site. Obviously you can click on the links one at a time to make local copies, but that loses one of the main advantages of working with computers–letting them do your work for you. […] In addition to writing my own spiders, I’ve used a number of these packages. Here I will describe DevonAgent.
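The core idea Turkel describes, letting the computer follow the links and fetch the files for you, can be sketched with nothing but Python's standard library. This is a minimal illustration, not Turkel's own code or DevonAgent's behaviour; the `example.com` page and the `.pdf` filter are hypothetical stand-ins for whatever site and file type you are collecting.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlretrieve


class LinkCollector(HTMLParser):
    """Collect the absolute URL of every <a href> on a page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))


def extract_links(page_html, base_url):
    """Return all link targets found in page_html as absolute URLs."""
    parser = LinkCollector(base_url)
    parser.feed(page_html)
    return parser.links


# Hypothetical index page listing two files to collect:
page = '<a href="files/a.pdf">A</a> <a href="files/b.pdf">B</a>'
pdfs = [u for u in extract_links(page, "http://example.com/")
        if u.endswith(".pdf")]

# Fetching each file is then one call per link (commented out here
# because example.com is a placeholder):
# for url in pdfs:
#     urlretrieve(url, url.rsplit("/", 1)[-1])  # save under its own name
```

A real spider would add politeness (delays between requests, respect for robots.txt) and crawl linked pages recursively, but the loop above is the whole trick: the machine, not you, clicks through the list.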
Read the full post on William Turkel’s blog.