What is it that a search engine does that cannot be done manually? Search engines match the key words of a search against an index built by another piece of software, called a "spider," that sucks up every link on the Web pages it scans. The links a search engine throws up depend on the query. A carefully worded query can hit the nail on the head in the first go, while a loosely framed search can throw up thousands of unwanted results, making life more miserable. With many search engines available, each claiming to be more efficient than the other, Net surfers are bound to be confused.

Which is the best search engine? The answer to this question is not all that simple; it is actually quite bewildering to work out which engine will serve a given search best. A little insight into the way a search engine works could prove very useful.

All search engines can broadly be categorised into five types: robotic Internet search engines, meta-indexes (mega-indexes), simultaneous (parallel) meta-indexes, subject directories and robotic specialised search engines.

The robotic Internet search engines traverse the Web's hypertext structure by retrieving a document and recursively retrieving all the documents it links to. Such programs are also known as "spiders," "Web wanderers," or "Web worms." The robotic search engines attempt to cover at random a significant portion of the World Wide Web. They examine that portion of the Internet with Universal Resource Locator (URL) addresses starting with http:// or with www, as well as parts of the Internet with Hyper Text Markup Language (HTML) links. Popular search engines in this category include AltaVista, Excite, HotBot, InfoSeek, Lycos, Open, Ultra and WebCrawler.

The second category, the meta-indexes or mega-indexes, have no databases of their own; instead, they are linked to robotic search engines.
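The recursive crawl that a robotic engine's spider performs can be sketched in a few lines. This is a minimal illustration, not any real engine's code: the tiny in-memory "web" below is hypothetical, and a real spider would fetch its pages over HTTP instead of reading them from a dictionary.

```python
# A minimal "spider" sketch: start at one page and recursively follow
# every link found, remembering visited pages so cycles don't loop forever.
from html.parser import HTMLParser

WEB = {  # hypothetical pages: URL -> HTML body (stands in for the real Web)
    "http://a": '<a href="http://b">b</a> <a href="http://c">c</a>',
    "http://b": '<a href="http://c">c</a>',
    "http://c": '<a href="http://a">a</a>',  # links back to the start
}

class LinkParser(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

def crawl(start):
    seen, frontier = set(), [start]
    while frontier:
        url = frontier.pop()
        if url in seen or url not in WEB:
            continue
        seen.add(url)                  # "index" the page
        parser = LinkParser()
        parser.feed(WEB[url])          # extract its links
        frontier.extend(parser.links)  # recurse into everything it links to
    return seen

print(sorted(crawl("http://a")))  # all three pages, despite the cycle
```

The `seen` set is what keeps the recursion from running forever on a Web full of circular links.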
There are thousands of such meta-indexes; many could be just personal Web pages with search engine links. Some meta-indexes are @Once!, All in One, Galaxy, Internet Sleuth, Magellan and Net Search.

A variant of the above, known as multi-threaded meta-indexes or simultaneous (parallel) meta-indexes, accesses the robotic Internet search engines in parallel (simultaneously) and presents the unified results as a single package. The two best-known simultaneous meta-indexes are MetaCrawler and Savvy Search.

The fourth category, the subject directories, is usually manually maintained and browsable, and is often searchable with robotic search engines. Yahoo! is the most famous in this category. Yahoo! has several subject headings; once a query is submitted, Yahoo! automatically connects to AltaVista for searching the Web at large. In another sense, Yahoo! is also a meta-index, since its hypertext links will take you to other robotic search engines besides AltaVista.
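The parallel querying that a simultaneous meta-index performs can be sketched as below. The two backend "engines" are hypothetical stand-ins returning canned result lists; a real meta-index such as MetaCrawler would send the query to live search engines over the network.

```python
# Sketch of a simultaneous (parallel) meta-index: query several engines
# at once, then unify their answers into a single package.
from concurrent.futures import ThreadPoolExecutor

def engine_a(query):  # hypothetical backend engine
    return ["http://a/1", "http://shared/page"]

def engine_b(query):  # hypothetical backend engine
    return ["http://shared/page", "http://b/1"]

def metasearch(query, engines):
    # Fire all queries simultaneously...
    with ThreadPoolExecutor() as pool:
        result_lists = pool.map(lambda e: e(query), engines)
    # ...then merge, dropping duplicates while keeping first-seen order.
    merged, seen = [], set()
    for results in result_lists:
        for url in results:
            if url not in seen:
                seen.add(url)
                merged.append(url)
    return merged

print(metasearch("spiders", [engine_a, engine_b]))
```

Because the engines are consulted concurrently, the meta-index is roughly as fast as its slowest backend rather than the sum of them all.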
The last category is the robotic specialised search engines. These engines focus on a portion of the Internet, which includes the World Wide Web; newsgroups and discussion lists; files available by file transfer protocol (FTP); people (white pages); companies (yellow pages); and software. They serve as a convenient one-stop location with links to the specialised search engines; links to yellow pages, white pages, etc., fall under this category.

Internet search has evolved rapidly since 1993, when WebCrawler became the first widely used search engine. Among the two dozen-odd search engines prevalent today, Yahoo! is undoubtedly the most popular, but Google has recently emerged as a very powerful search engine. Developed by Lawrence Page and Sergey Brin at Stanford University in the US, it claims: "Google uses sophisticated text-matching techniques to find pages that are both important and relevant to your search. For instance, when Google analyses a page, it looks at what those pages linking to that page have to say about it. Google also prefers pages in which your query terms are near each other."

For ease of use and convenience, software such as Copernic has been developed that consults many search engines simultaneously and brings back relevant results with summaries. It also removes duplicate information and dead links, making Internet search easy. Software like Internet Detective provides scores of links organised under topics such as locating people, investigative resources, information and government resources and newsgroup searches, making it easy to locate specific information on the Net.

Different engines have different strong points; use the engine and features that best suit your requirements. One thing is obvious: the engine that brings up the maximum number of results is certainly not the best. The search engine that gives you a few, but specific, answers is what you should choose, and the best results can be obtained by selecting the words for the query very carefully. Research is on to classify information into categories that will really improve searching. The pursuit of the best way to navigate the world of electronic information goes on.
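Google's stated preference for pages "in which your query terms are near each other" can be illustrated with a toy proximity score. The scoring formula below is an invented example for two distinct query terms, not Google's actual text-matching algorithm.

```python
# Toy illustration of term proximity: pages where the two query terms
# occur close together score higher than pages where they are far apart.
# This formula is an assumption for illustration, not a real ranking function.
def proximity_score(page_text, terms):
    words = page_text.lower().split()
    positions = [
        [i for i, w in enumerate(words) if w == t.lower()] for t in terms
    ]
    if any(not p for p in positions):
        return 0.0  # a query term is missing from the page entirely
    # Smallest gap between any occurrence of the first and second term.
    gap = min(abs(i - j) for i in positions[0] for j in positions[1])
    return 1.0 / gap  # adjacent terms give the highest score

near = "the search engine crawls pages"
far = "the search found many pages before the engine finished"
print(proximity_score(near, ["search", "engine"]))  # terms adjacent
print(proximity_score(far, ["search", "engine"]))   # terms far apart
```

On the `near` sentence the terms sit side by side, so it outranks the `far` sentence, where six words separate them.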