Is SEO Doomed as Search Engines Develop?

Lately I have been wondering just how much longer it will be possible to optimize web pages for front page results on the major search engines such as Google, Yahoo, and MSN. If you read my earlier article here, you know by now that Google is experimenting with their User Interface (UI) and has been suggesting other possible searches users may be interested in, or might have meant to run instead.

They have several of these experiments running at the same time. One is an added link in the top sponsored results. Another replaces listings 6, 7, and 8 with the top three listings from a related, sometimes-commercial keyword term (detailed in the prior article). They also provide commercial recommendations at the top of search results.

Background of Search Engine “Fixing”

Having spent several years now optimizing websites and marketing them over the internet, I have gained a view of humanity that makes me laugh hysterically (sometimes so hard I think I may be spending a tad too much time online). Among the things that blow my mind are the people who think they are smarter than Google and its team of engineers. I guarantee one thing: if you think you are smarter than Google and can outwit them, you are basically a fool. The engineers at Google are some of the very best in the industry. They also receive more tender loving care from their employer than most newborn infants get from their mothers.

Google is the engineers’ Nirvana or Xanadu. These guys and girls are treated like royalty in ways most of us cannot understand. Even among the best of employers, there are few corporations that can match what Google gives its engineers. The fact that 20 percent of their paid work time may be used for personal projects is just the tip of the iceberg. Most companies have a problem with employees making a 20-second personal phone call, and aren’t about to pay an employee to spend one day of the work week working on his or her own ideas.

In a lot of SEO forums, I see questions asking whether Google uses the spam reports that people file. The answers usually drift off topic, toward those who fill the reports out and those who think the people filing them are rats. The real answer is yes, Google does use the spam reports. They do not normally remove sites by hand; instead, the engineers use the spam reports to teach their algorithm to do the work that is needed.

More Background of Search Engine “Fixing”

The Google engineers look at the spam site, reverse engineer what it has done to manipulate the Search Engine Results Pages (SERPs), and then write algorithmic equations to remove the site via Googlebot and the algorithm. I imagine the pride on the engineers’ faces as they set the crawler loose and watch the spam website go down the drain. I get a vicarious thrill just thinking about it.

After Google’s great purge, webmasters who think themselves brighter than this team of Google engineers, and the engineers of other search engines, come up with the next great elixir that will rank pages high on the SERPs. We had the true black hatters who ran gateway pages; when those went away, people soon found out that links were the answer to getting onto the front pages, and like lemmings everyone hopped on board for the greatest deception ever.

We then had marketing experts who decided selling links was a good idea, and this was promoted and marketed by all the hot shot SEOs. The only problem was that the owners of the text-link-selling ad sites forgot one important thing: you don’t play in another team’s house, cheat, and expect to win. Google got wind of this and set the engineers to work. It was not long before all the smart (cough) webmasters were found in the forums crying about why their sites had sunk off the results pages like the Titanic plunging to the ocean floor.

After this debacle was discovered, a few smarter SEOs figured out that content was king. They decided that adding pages and pages of content, optimized for the SERPs, would get them the results and money they wanted. So these geniuses built programs to scrape content from other websites and churn out rip-off pages quickly. The Google engineers caught wind of this too, and within two months booted the scraper websites from the SERPs.

Thank You Blackhats for Causing the New SEO

I personally want to thank each and every one of you geniuses for your great work. By doing the complete opposite of what you did, none of my clients has ever suffered any filter or penalty. The main reason is that I know I am not smarter than an engineer at Google. I understand data mining, SEO, SEM, and one other very important thing: not one of us would let someone come into our homes and rearrange things on a daily basis. So why would anyone think that Google would allow others to come rearrange their home?

I also want to thank you for helping to destroy what was a great thing. Google’s brand was built on the relevancy of the free results listings it returned. At the time there was no money involved in generating revenue for Google, but the boys did know there was a veritable gold mine waiting to happen.

Thanks to the geniuses, where once all links counted in ranking a web page, now about the only worthwhile link is a natural link. Thanks to you geniuses, basic SEO on the back-end code is virtually useless, as almost all back-end code optimization is ignored except for the <title> tag. Thanks to the geniuses, Google now has to take the time to fully investigate a website before determining whether it should be included, a process that can take a few months. Thanks to the geniuses, we must now be careful about how many pages of content we add to websites, and with what frequency.

Those of you who think Google is not using the Google Sitemaps beta to distinguish legitimate from illegitimate websites, and that it is merely a way to index the world wide web more easily, should really take some time to think this through clearly.

While the geniuses were hard at work trying to cheat Google, the engineers at the company were studying lemmatization, tokenization, filtration, and stemming. They are using these math and English fundamentals to determine Term Weight. The Term Weight formula, TF*IDF, helps the search engines determine the topic of a web page.
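To make the TF*IDF idea concrete, here is a minimal sketch in Python. It is not Google’s implementation (their ranking formulas are far more involved and unpublished); it just shows the textbook definition: a term weighs more when it is frequent in one page (TF) but rare across the rest of the corpus (IDF). The tiny three-document corpus is invented for illustration.

```python
import math
from collections import Counter

def tf_idf(term, doc_tokens, corpus):
    """Textbook TF*IDF score for one term in one tokenized document.

    TF  = occurrences of the term in this document, normalized by length.
    IDF = log(total documents / documents containing the term).
    """
    tf = Counter(doc_tokens)[term] / len(doc_tokens)
    df = sum(1 for doc in corpus if term in doc)     # document frequency
    idf = math.log(len(corpus) / df) if df else 0.0  # unseen term scores 0
    return tf * idf

# A made-up corpus of three "pages", pre-tokenized by whitespace.
docs = [
    "new cars for sale".split(),
    "used cars and trucks".split(),
    "jewelry and diamond rings".split(),
]
```

Because "cars" appears in two of the three documents while "jewelry" appears in only one, "jewelry" ends up with the higher weight even though both occur once in their own page; that is the intuition behind a term signaling the topic of a page.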

In addition to these areas, Google is now using other data to determine which web pages to place near the top of the results. This data includes reading flow, grammar, spelling, Automatic Topic Categorization (which allows search engines to automatically classify and categorize web pages through text analysis), and supporting ontologies. The search engines are also looking more closely at user queries, time spent on pages (site stickiness), data from toolbars, and other visitor analytics such as click-through rate and traffic counts. To be honest, one thing websites need in order to attain front page results is traffic. Why would Google, Yahoo, or MSN recommend a website nobody visited?
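As a toy illustration of what automatic topic categorization might look like, the sketch below scores a page’s text against hand-built topic vocabularies and picks the best match. The topic names and word lists are hypothetical; a real engine learns its categories statistically from huge corpora rather than from hard-coded sets, and would apply the stemming mentioned above so that "trucks" matches "truck".

```python
from collections import Counter

# Hypothetical topic vocabularies for illustration only.
TOPICS = {
    "autos": {"car", "cars", "engine", "dealer", "truck"},
    "jewelry": {"ring", "diamond", "gold", "necklace", "jewelry"},
}

def categorize(text):
    """Return the topic whose vocabulary overlaps the page text the most."""
    words = Counter(text.lower().split())
    scores = {
        topic: sum(words[w] for w in vocab)  # Counter gives 0 for absent words
        for topic, vocab in TOPICS.items()
    }
    return max(scores, key=scores.get)
```

For example, `categorize("New cars and used trucks from a local dealer")` lands on "autos". Crude as it is, this is the shape of the problem: given only the text, decide what the page is about before deciding where it ranks.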

SEO, I Hardly Knew Ye

All of these implementations have led to the slow death of search engine optimization (SEO) for front page results! We have been watching Google fix these problems by making their spiders so sensitive they can damage the rankings of legitimate sites. Don’t think it stops there. Another issue that leads me to believe SEO is dying is Google’s new experiments with the UI changes I mentioned earlier. As Google looks to further monetize the search results, I think you will see more and more inner referrals coming from the major search engine players.

The moves to increase the search engine indexes make it appear they could soon index everything on the internet. Whether it is Google Images (which drives huge traffic to jewelry websites), Froogle, Local, Books, PDFs, MP3s, MPEGs, AVIs, PowerPoint presentations, white papers, or other intellectual property not yet thought of, there will be less and less of the old ten listings of websites.

One only needs to read the first sentence of Google’s mission statement to see why I make these claims:

Google’s mission is to organize the world’s information and make it universally accessible and useful.

Information, as we all know, is not restricted to websites. It includes Google services. This tells us that other information can be added into the SERPs, and around them, without breaking their policy. This other information will come in the many forms I mentioned. Add in globalization and localization and you can see Google is headed this way already: users in California, New York, London, Moscow, and Beijing searching for “new cars” will each receive a different set of front page results listings.

Reading further into Google’s Mission Statement we find this sentence:

Google is now widely recognized as the world’s largest search engine — an easy-to-use free service that usually (Nice way to CYA guys!) returns relevant results in a fraction of a second.

“Relevant results” is the key phrase here, and another reason why I believe SEO is a dying method of attaining front page results. Websites are not the only thing relevant to a user’s search query. This is why Google gave us the new Desktop Search 2 tool, as it is much quicker than using Windows Search for information stored on our computing devices.

The New Search Experience

Furthermore, Google has changed their mission statement. This is as it should be, since a company’s mission can change over time.

When you visit or one of the dozens of other Google domains, you’ll be able to find information in many different languages; check stock quotes, maps, and news headlines; look up phonebook listings for every city in the United States; search more than two billion images and peruse the world’s largest archive of Usenet messages — more than 1 billion posts dating back to 1981.

As you can clearly see, it does not mention anything about websites. The emphasis is now on information coming in all shapes and forms. Some information is vocal, some is visual, some is written, and other information is typed. When relevant, this information can drive targeted traffic to your business through various methods. Don’t be a genius and try to manipulate the search engines. That is their house, and they will take whatever steps are needed to protect themselves from tricks.

Just like in sports, when your team enters the home of the defending champions, your game must be executed with precision and excellence in order to win. Otherwise you will be in for a long, depressing, losing game. And unlike real sports, the home search engine can force you to lose if you don’t play by their rules!

I know a lot of people reading this will think it is crazy, but that is fine; I agree with you. Some will be angry, others hurt, others will laugh at me, and quite a few will respond that Google will never change the SERPs. I have some news for those of you thinking of replying with that line: Google has changed the result pages you see for user search queries several times since the website went live.

Put away your fanaticism for a moment and look at the situation from the view of a search engine. What is the easiest way to serve relevant content to customers who don’t like clicking a lot? Is it providing links to stock pages cluttered with tricky, spammy sites, or showing a stock ticker and graph on the SERPs themselves?
