After reviewing the material in week one, I found the concept of bots, or spiders, difficult to understand. The material says that the bots "crawl" the web "by going to a page and collecting every single link on that page." Huh??? Maybe I was taking it too literally, but I just didn't understand what a bot actually was or why it would want to collect every link. Since this is an unfamiliar subject and I couldn't grasp the concept, I decided to surf the web for more information.
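Writing out a tiny version of the idea helped me. Here is a minimal sketch (not anything the course gave us, and the starting page is just a placeholder) of what I think "collecting every single link on a page" means, using only Python's standard library:

```python
# A toy "bot": fetch one page and collect every link on it,
# the way the course material describes crawling.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag it sees."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def collect_links(url):
    # Fetch the page and return every link on it as an absolute URL.
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    parser = LinkCollector()
    parser.feed(html)
    return [urljoin(url, link) for link in parser.links]

if __name__ == "__main__":
    # "example.com" is only a placeholder starting page.
    for link in collect_links("https://example.com/"):
        print(link)
    # A real bot would then visit each of these links in turn and
    # collect their links too -- that, as I understand it, is the "crawl."
```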
I came across a site called Google Webmaster Central:
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=70897
I personally think this site gave me a better explanation of bots, with more detail. It also gave a great analogy: think of the web as a really big book. To find anything in the book, you look at the index, and it directs you exactly where you need to go. These bots, which are search software, collect documents from the web and build an index for a particular search engine like Google. So, every time you search for something on Google, the bots have already helped by building the index that lets Google pull up only the URLs most relevant to your search.
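To convince myself I really got the book-index analogy, I sketched a toy index too. The pages and their text below are completely made up; the point is just that looking up a word points you straight to the pages that contain it:

```python
# A toy index: map each word to the pages ("documents") that contain it,
# like the index in the back of a big book.
from collections import defaultdict

def build_index(documents):
    """documents: dict of URL -> page text. Returns word -> set of URLs."""
    index = defaultdict(set)
    for url, text in documents.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

# Made-up pages just to show the idea.
pages = {
    "http://example.com/dogs": "dogs are loyal pets",
    "http://example.com/cats": "cats are independent pets",
}

index = build_index(pages)
print(index["pets"])   # both URLs
print(index["dogs"])   # only the dogs page
```

A real search engine's index is obviously far more complicated than this, but the basic idea of "collect documents, then build a lookup from words to URLs" finally made sense to me.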