The Future of WebSite Ranking
by Mel Strocen

The recent shakeup in Google's search results, which set the SEO (search engine optimization) community buzzing and saw tens of thousands of webmasters watch their site ranking plummet, was in many ways inevitable. Almost all SEO companies and most savvy webmasters had a fairly good handle on what Google considered important. And since SEO, by definition, is the art of manipulating website ranking (not always with the best interests of searchers in mind), it was only a matter of time until Google decided to make some changes.

If you've been asleep at the SEO switch, here are a few links to articles and forums that have focused on the recent changes at Google:

Articles:
http://www.sitepronews.com/archives/2003/dec/1prt.html
http://www.searchengineguide.com/lloyd/2003/1125_bl1.html
http://www.searchenginejournal.com/index.php?cat=1
http://www.accordmarketing.com/tid/archive/google-update-florida.html

Forums:
http://www.webmasterworld.com/forum3/
http://www.jimworld.com/apps/webmaster.forums/action::topiclist/forum::google/
http://www.searchguild.com/viewforum.php?f=1&sid=3d5d777a7a9c7dda31622896015f733a

To date, most of the commentary has been predictable, ranging from the critical and analytical to the speculative.

Here's a typical example from one of our SiteProNews readers:

"I'm not sure what has happened to Google's vaunted algorithm, but searches are now returning unrelated junk results as early as the second page and even first page listings are a random collection of internal pages (not index pages) from minor players in my industry (mostly re-sellers) vaguely related to my highly-focused keyword search queries."

So, what is Google trying to accomplish? As one author put it, Google has a "democratic" vision of the Web. Unfortunately for Google and the other major search engines, those with a grasp of SEO techniques were beginning to tarnish that vision by stacking the search result deck in favor of their websites.

Search Engine Optimization or Ranking Manipulation?

Author and search engine expert Barry Lloyd commented as follows: "Google has seen their search engine results manipulated by SEOs to a significant extent over the past few years. Their reliance on PageRank™ to grade the authority of pages has led to the wholesale trading and buying of links with the primary purpose of influencing rankings on Google rather than for natural linking reasons."

Given Google's dominance of search and how important ranking well in Google is to millions of websites, attempts at rank manipulation shouldn't come as a surprise to anyone. For many, achieving a high site ranking is more important than the hard work it takes to legitimately earn a good ranking.

The Problem with Current Site Ranking Methods

There will always be those who are more interested in the end result than in how they get there, and site ranking that is based on site content (links, keywords, etc.) and interpreted by ranking algorithms will always be subject to manipulation. Why? Because, for now, crawlers and algorithms lack the intelligence to make informed judgements on site quality.

A short while ago, author Mike Banks Valentine published an article entitled "SEO Mercilessly Murdered by Copywriters!" (http://www.sitepronews.com/archives/2003/nov/21.html). The article rightly pointed out SEO's focus on making text and page structure "crawler friendly". Other SEO authors have written at great length about the need for "text, text, text" in page body content as well as in Meta, Heading, ALT, and Link tags. They are all correct, and yet they are all missing (or ignoring) the point, which is that the "tail is wagging the dog". Search engines are determining what is relevant, not the people using those engines. Searchers are relegated to the role of engine critics, and webmasters to that of students of SEO.
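To see how easily this kind of content-driven scoring can be gamed, consider the following deliberately naive sketch in Python. The element weights, the regex patterns, and the sample pages are invented purely for illustration; no real engine scores pages this simply, but the failure mode is the same in spirit: whoever controls the page text controls the score.

import re

# Toy content-based scorer, illustrative only. It rewards exactly the elements
# discussed above (title, meta, headings, ALT and link text, body), which is
# why stuffing keywords into those elements inflates the score.
WEIGHTS = {"title": 5.0, "meta": 3.0, "heading": 4.0, "alt": 2.0, "link": 2.0, "body": 1.0}

PATTERNS = {
    "title":   r"<title[^>]*>(.*?)</title>",
    "meta":    r'<meta[^>]+content="([^"]*)"',
    "heading": r"<h[1-3][^>]*>(.*?)</h[1-3]>",
    "alt":     r'alt="([^"]*)"',
    "link":    r"<a\b[^>]*>(.*?)</a>",
}

def keyword_score(html: str, query: str) -> float:
    """Weighted count of query-term occurrences in each page element."""
    terms = query.lower().split()
    score = 0.0
    for element, weight in WEIGHTS.items():
        if element == "body":
            text = re.sub(r"<[^>]+>", " ", html)  # strip all tags, keep visible text
        else:
            text = " ".join(re.findall(PATTERNS[element], html, re.I | re.S))
        words = re.findall(r"[a-z0-9]+", text.lower())
        score += weight * sum(words.count(t) for t in terms)
    return score

# A page stuffed with the target phrase outranks an honest page that simply
# covers the topic.
stuffed = ('<title>cheap widgets cheap widgets</title><h1>cheap widgets</h1>'
           '<img alt="cheap widgets cheap widgets">buy cheap widgets now')
honest = '<title>Widget buying guide</title><p>How to compare widgets and prices.</p>'
print(keyword_score(stuffed, "cheap widgets"), ">", keyword_score(honest, "cheap widgets"))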

SEO manipulation will continue and thrive as long as search engines base their algorithms on page and link analysis. The rules may change, but the game will remain the same.

Therein lies the problem with all current search engine ranking algorithms. SEOs will always attempt to position their sites at the top of search engine results whether their sites deserve to be there or not, and search engines will continue to tweak their algorithms in an attempt to eliminate SEO loopholes. If there is a solution to this ongoing battle of vested interests, it won't come from improving page content analysis.

Incorporating User Popularity into Ranking Algorithms

The future of quality search results lies in harnessing the opinions of the Internet masses - in other words, in tying search results and site ranking to User Popularity. Google's "democratic" vision of the Web will never be achieved by manipulating algorithm criteria based on content. It will only be achieved by factoring in what is important to people, and people will always remain the best judge of what that is. The true challenge for search engines in the future is how to incorporate web searcher input and preferences into their ranking algorithms.

Website ranking based on user popularity - the measurement of searcher visits to a site, pages viewed, time viewed, etc. - will be far less subject to manipulation and will ensure a more satisfying search experience. Why? Because websites that receive the kiss of approval from 10,000, 100,000 or a million-plus surfers a month are unlikely to disappoint new visitors. Although some websites might achieve temporary spikes in popularity through link exchanges, inflated or false claims, email marketing, pyramid schemes, etc., these spikes would be almost impossible to sustain over the long term. As Lincoln said, "You can fool some of the people all the time. You can fool all the people some of the time. But you can't fool all the people all the time." Any effective ranking system based on surfer input will inevitably be superior to current systems.
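Here is one hedged sketch, in Python, of how the behavioural signals named above (visits, pages viewed, time viewed) might be folded into a single popularity score. The field names, weights, and log scaling are assumptions chosen for illustration; they are not ExactSeek's or Alexa's actual formula.

import math
from dataclasses import dataclass

@dataclass
class SiteUsage:
    monthly_visitors: int     # distinct surfers observed in a month
    pages_per_visit: float    # average pages viewed per visit
    minutes_per_visit: float  # average time spent per visit

def popularity_score(u: SiteUsage) -> float:
    """Log-scale reach so a million-visitor site does not automatically dwarf
    everything, then reward engagement (depth and time) multiplicatively."""
    reach = math.log10(max(u.monthly_visitors, 1))
    engagement = (1 + math.log1p(u.pages_per_visit)) * (1 + math.log1p(u.minutes_per_visit))
    return reach * engagement

# Hypothetical sites ranked by observed behaviour rather than by on-page text:
sites = {
    "big-but-shallow.example":    SiteUsage(1_000_000, 1.1, 0.5),
    "smaller-but-sticky.example": SiteUsage(100_000, 6.0, 8.0),
}
for name, usage in sorted(sites.items(), key=lambda kv: popularity_score(kv[1]), reverse=True):
    print(f"{name}: {popularity_score(usage):.2f}")

The particular weighting hardly matters for the argument; the point is that a score built from what surfers actually do is harder to inflate with on-page tricks than a score built from what a page says about itself.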

To date, none of the major search engines have shown a serious interest in incorporating user popularity into their ranking algorithms. As of this writing, ExactSeek (http://www.exactseek.com) is the only search engine that has implemented a site ranking algorithm based on user popularity.

Resistance to change, however, is not the only reason user data hasn't made its way into ranking algorithms. ExactSeek's new ranking algorithm was made possible only as a result of its partner arrangement with Alexa Internet, one of the oldest and largest aggregators of user data on the Web. Alexa has been collecting user data through its toolbar (downloaded over 10 million times) since 1997 and is currently the only web entity with a large enough user base to measure site popularity and evaluate user preferences in a meaningful way.

The Challenges Facing User Popularity Based Ranking

1. The Collection Of User Data:
In order for web user data to play a significant role in search results and site ranking, it would need to be gathered in sufficient volume and detail to accurately reflect web user interests and choices. The surfing preferences of a few million toolbar users would be meaningless when applied to a search engine database of billions of web pages. Even Alexa, with its huge store of user data, is only able to rank 3 to 4 million websites with any degree of accuracy (a rough, illustrative calculation follows this list).

2. Privacy:
The collection of user data obviously has privacy implications. Privacy concerns have become more of an issue in recent years and could hinder any attempt to collect user data on a large scale. The surfing public would need to cooperate in such an endeavor and be persuaded of the benefits.

3. Interest:
Web search continues to grow in popularity with more than 80% of Internet users relying on search engines to find what they need. However, with the exception of site owners who have a vested interest in site ranking, most web searchers have not expressed any serious dissatisfaction with the overall quality of search results delivered by the major engines. Harnessing the cooperation and active participation of this latter and much larger group would be difficult, if not impossible.
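To put rough numbers on the volume problem raised in point 1, here is a back-of-envelope sketch in Python. The Zipf-style traffic assumption and every figure in it (panel size, page views per user, sites on the Web, observation threshold) are hypothetical, chosen only to show the order of magnitude; the result happens to land near the 3 to 4 million accurately rankable sites mentioned above.

import math

def sites_rankable(panel_users, views_per_user, total_sites, min_observations):
    # Total page views the panel generates in a month.
    total_views = panel_users * views_per_user
    # Under a Zipf-like traffic curve over total_sites sites, the site at
    # popularity rank r receives roughly a 1/(r * H_n) share of all views,
    # where H_n ~ ln(n) + 0.5772 is the harmonic number.
    harmonic = math.log(total_sites) + 0.5772
    # A site clears the threshold while total_views / (r * harmonic) >= min_observations,
    # i.e. for ranks r up to the value returned here.
    return int(total_views / (min_observations * harmonic))

# Hypothetical figures: a 10-million-user panel, 1,000 page views per user per
# month, 50 million sites, and at least 100 observations needed to rank a site.
print(sites_rankable(10_000_000, 1_000, 50_000_000, 100))  # roughly 5-6 million sites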

The future of web search and website ranking belongs in the hands of all Internet users, but whether it ends up there depends on how willing they are to participate in that future.


About the Author
Mel Strocen is CEO of the Jayde Online Network of websites. The Jayde network currently consists of 12 websites, including ExactSeek.com (http://www.exactseek.com) and SiteProNews.com (http://www.sitepronews.com).
