Inadequacies of Search Engines

A complete listing of web sites and their documents currently does not exist. Instead, consumers must visit different search sites or related web sites that might offer useful links. This lack of a complete directory is not in itself a new problem: in physical markets, a telephone directory lists only local businesses, and there are many specialized directories for different industries and markets. However, there is no reason why, especially on the Internet, all of the information housed in a library's reference section cannot be combined into one database. Combining different Internet search databases would spare users the hassle of consulting several search engines and eliminate the duplicative cost of many users collecting the same information. Yet to recover the cost of compiling an Internet database, more and more search engines are preoccupied with soliciting advertisers rather than with improving data integrity and search efficiency. Search engines may be one of the few Internet services that are truly essential to the usability and usefulness of the Internet for commerce, and an incomplete search engine is only as useful as a partial phone directory.
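
To make the combined-database idea concrete, the following sketch shows one way results from separate search databases could be merged and de-duplicated by address. It is only an illustration: Result, merge_results, and the engine_a/engine_b functions in the comment are hypothetical names, not the interface of any real search engine.

from dataclasses import dataclass

@dataclass(frozen=True)
class Result:
    url: str
    title: str

def merge_results(*result_lists: list[Result]) -> list[Result]:
    # Combine result lists from several search databases, dropping
    # duplicate URLs while preserving the first-seen order.
    seen: set[str] = set()
    merged: list[Result] = []
    for results in result_lists:
        for result in results:
            if result.url not in seen:
                seen.add(result.url)
                merged.append(result)
    return merged

# Hypothetical usage, where engine_a() and engine_b() stand in for
# queries against two separate search databases:
# combined = merge_results(engine_a("metadata standards"),
#                          engine_b("metadata standards"))

De-duplicating on the URL alone is the simplest possible policy; a genuinely combined database would also have to reconcile conflicting titles and descriptions recorded for the same address.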

Internet search databases are also inaccurate and outdated because web sites are constantly changing; they often return links to pages that no longer exist. In such an environment, updating the database may require as much effort as compiling it in the first place. An alternative may be to accept, or even require, submissions from site owners about changes. Another source of inaccuracy is web sites misrepresenting themselves and pretending to be something they are not. That possibility compels data compilers to verify each site manually, further increasing the cost of maintaining an accurate database. A more coordinated system of feedback among content providers, users, and search engines is needed.
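
The dead-link problem in particular lends itself to automated checking. The sketch below, written against Python's standard urllib module, shows the kind of lightweight test a database maintainer might run over indexed addresses; link_is_live and indexed_urls are illustrative names rather than part of any existing search engine.

import urllib.error
import urllib.request

def link_is_live(url: str, timeout: float = 5.0) -> bool:
    # Send a lightweight HEAD request and report whether the page
    # still responds with a non-error status.
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return 200 <= response.status < 400
    except (urllib.error.URLError, TimeoutError, ValueError):
        # Network failures, timeouts, or malformed URLs all count as dead.
        return False

# Hypothetical usage over a list of indexed addresses:
# stale = [url for url in indexed_urls if not link_is_live(url)]

A check like this only confirms that a page still answers; it says nothing about whether its content has changed, which is why submissions from site owners and feedback from users remain necessary.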

A third inadequacy of current search engines is the irrelevance of many sites matching search keywords. One problem stems from the lack of sophisticated search mechanisms that can weed out irrelevant information. Equally lacking is a proper description of each web site and its materials on which to base a search. As a result, a simple search often produces tens of thousands of meaningless links. Digital document metadata standards need to be established, accepted by content providers, and made part of content creation.
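
Such descriptions already have a home in HTML, which lets pages carry name/content pairs (keyword lists, descriptions, or Dublin Core fields) in meta tags. The following sketch, using Python's standard html.parser module, shows how an indexer could collect those pairs; the sample page and field names are made up for illustration.

from html.parser import HTMLParser

class MetaTagCollector(HTMLParser):
    # Collect <meta name="..." content="..."> pairs such as
    # description, keywords, or Dublin Core fields like DC.title.
    def __init__(self) -> None:
        super().__init__()
        self.metadata: dict[str, str] = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attributes = dict(attrs)
            name = attributes.get("name")
            content = attributes.get("content")
            if name and content:
                self.metadata[name.lower()] = content

page = """<html><head>
<meta name="DC.title" content="Inadequacies of Search Engines">
<meta name="keywords" content="search engines, metadata, indexing">
</head><body>...</body></html>"""

collector = MetaTagCollector()
collector.feed(page)
print(collector.metadata)
# {'dc.title': 'Inadequacies of Search Engines',
#  'keywords': 'search engines, metadata, indexing'}

A search built on fields like these can rank pages by their declared subject rather than by raw keyword matches, which is the point of asking content providers to adopt a common metadata standard.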

Finally, search results need to be objective. Results are skewed if the database itself consists of information pre-selected on arbitrary criteria. Some search engines exclude personal homepages or materials residing on university web sites. Others reject web sites that they consider offensive, indecent, or frivolous by their own standards. With increasing commercialization, some search engine providers may also give preference to paying advertisers. All of these are reasonable behaviors for private enterprises, but what would be the use of a phone directory that omitted every "Smith" or everyone living in a particular zip code? An Internet search engine is no longer just a springboard for Internet surfing. Rather, as an essential piece of infrastructure, its database needs to be complete and accurate to foster efficient information exchange.
