Really!!! - Pet Peeve #39
Although I have only been using the internet for a few years, one thing I notice constantly is the sheer number of outdated or unmaintained web pages. This type of information might be great when one is doing a research project, but finding only information that is six or seven years old in the top returns of a search does very little to expedite the process when one is hunting for current information. Shouldn't search engines have an archive section for web pages that are more than six months old? That way, people who are not looking for current info could dig through the advanced search section for it.

And whose responsibility is it to make sure the info being provided is as current as possible? Is it the search engine's, or the webpage author's/webmaster's? Doesn't serving outdated search results increase the chance that unscrupulous persons or organizations will use stale information to project a false image of current events to unwitting viewers? Shouldn't the ranking system penalize web pages that fail to provide a "last updated" statement, no matter how many links are connected to the site? After all, is there currently a system that proves when those links were last used?

The bottom line is that most people are looking for the most current and accurate information when they do web searches. By failing to ensure that the links in the top results of a search point to active, maintained sites, the engines are falling short of that goal.
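For the curious: the web does already have a rough version of the "last updated" statement this post asks for, the HTTP Last-Modified header. Below is a minimal Python sketch of how one might flag stale pages with it, assuming a six-month cutoff and a server that actually reports the header; plenty of servers omit it or report misleading values, which is exactly the gap complained about above. The example.com URL is just a placeholder.

# Minimal sketch: flag a page as stale if its HTTP Last-Modified header
# is more than six months old, or if it reports no date at all
# (treating a missing "last updated" as a penalty, per the post).
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime
from urllib.request import Request, urlopen

SIX_MONTHS = timedelta(days=182)  # assumed cutoff, roughly six months

def is_stale(url: str) -> bool:
    """Return True if the page looks stale or gives no update date."""
    req = Request(url, method="HEAD")  # HEAD: fetch headers only
    with urlopen(req) as resp:
        header = resp.headers.get("Last-Modified")
    if header is None:
        return True  # no "last updated" info at all
    last_modified = parsedate_to_datetime(header)  # HTTP dates carry a timezone
    return datetime.now(timezone.utc) - last_modified > SIX_MONTHS

print(is_stale("https://example.com/"))  # placeholder URL

A real search engine would of course weigh freshness alongside many other signals rather than using a hard cutoff, but the sketch shows that the raw ingredient for an "older than six months" archive filter already exists.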