Wednesday, November 3, 2010

Week 11: Web Search and OAI Protocol

Because just about anyone now has the means to create a website, nearly anyone can publish material over the Internet. As more people gain that ability, sources of information can easily get lost in the shuffle, which is why search engines exist to help establish order. A search engine uses an algorithm that ranks websites by how strongly they match the searcher's keywords, weighted by measures of popularity such as how many visitors a site receives. As a result, the most popular websites in each category end up at the top of the results list. However, just because a website is popular does not mean its content is accurate. This is where metadata comes in: by describing resources in a structured way, it allows websites to be harvested in a more meticulous manner, concentrating on the quality of the content rather than the quantity of visitors. This approach is bound to be met with disagreement, or even hostility, on the grounds that it could encourage a form of elitism. Metadata effectively gives a select few the power to decide which websites should be regarded as superior, while the opinions of the many are ignored. With or without metadata, though, the results a search presents barely scratch the surface. This is why another technology was needed, one that looks much deeper into the content of websites and delivers greater accuracy and efficiency in the results. The use of the Deep Web achieves this goal by compromising between the two: the popularity-driven approach of conventional search engines and the selective approach of metadata harvesting.
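The harvesting mentioned above is what the OAI Protocol for Metadata Harvesting (OAI-PMH) standardizes: a client repeatedly issues ListRecords requests against a repository and pages through the structured metadata it gets back. Below is a minimal sketch of such a harvest in Python. The repository endpoint (http://example.org/oai) is a hypothetical placeholder, while the verb, metadataPrefix, and resumptionToken parameters, along with the XML namespaces, come from the actual protocol specification.

```python
from urllib.parse import urlencode
from urllib.request import urlopen
import xml.etree.ElementTree as ET

# Hypothetical repository endpoint; any OAI-PMH-compliant archive would do.
BASE_URL = "http://example.org/oai"

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def harvest_titles(base_url):
    """Yield Dublin Core titles from every page of a ListRecords response."""
    token = None
    while True:
        if token:
            # A resumption token continues a partial list; per the protocol,
            # it is sent alongside the verb with no other arguments.
            params = {"verb": "ListRecords", "resumptionToken": token}
        else:
            params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
        with urlopen(base_url + "?" + urlencode(params)) as response:
            root = ET.parse(response).getroot()
        for record in root.iter(OAI + "record"):
            title = record.find(".//" + DC + "title")
            if title is not None and title.text:
                yield title.text
        # An absent or empty resumptionToken marks the end of the list.
        token_element = root.find(".//" + OAI + "resumptionToken")
        if token_element is None or not (token_element.text or "").strip():
            break
        token = token_element.text.strip()

for title in harvest_titles(BASE_URL):
    print(title)
```

Notice how quality control shifts here: instead of counting visitors, someone curates which repositories get harvested, which is precisely the elitism concern raised above.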

Regardless of what technology becomes available, a minority frequently ends up overpowering the majority. The only significant difference is whether the shared opinion of a populace or an agreed-upon decision by an elite becomes the determining factor; whichever prevails gets to choose the minority that will overshadow the majority. Tensions can arise between the two sides, though not always. Sometimes the popular choice and the right choice are one and the same, and it is these instances of mutual agreement and understanding that Michael K. Bergman's proposition attempts to exploit. Yet even if only the most genuine websites manage to become that minority, the situation grows increasingly difficult for the majority. There is some relief in knowing that websites offering nothing but junk are more likely to be thrown deeper into the depths, but websites of better quality are still liable to be ignored. A website can present information as professionally as any scholarly source, and there is still no guarantee that it will achieve wider recognition. One factor that the innovation Bergman describes would ignore, and that might apply to its future successors as well, is human nature. So long as people want quick results and get bored easily, whatever a person chooses to look into, chances are they will glance at the top ten results at most and thoroughly read only one of them, if that. Unless a website has made an impact on both the populace and the elite, it would be lucky to reach even the top twenty.
