Craziness at the Googleplex

What in the WORLD is going on at Google? Since early last year, there have been widespread reports of Google dropping the caching of current pages while keeping pages two or more years old that no longer even exist on the site. There have also been widespread reports of Google indexing some pages on a site but ignoring others, with no clear pattern to what the spider is doing. What’s the deal?

In this article I’ll shed some light on what happened, why it happened, what you can do about it … and take a look at what I believe will be the most significant update in Google history coming up early this year.

In January of last year (2006), Google went through the "Big Daddy" update.

Unfortunately, since then things at Google have been … unstable, for lack of a better way of putting it.  The reason for this is simple.  Google servers ran out of space.

I know that sounds crazy, even bizarre, but it’s true, and it was admitted by none other than Google’s CEO in April (the full story is here).  His exact words were, "Those machines are full.  We have a huge machine crisis."

For the CEO of a search engine company to admit that his servers are so full that they’ve got a crisis is huge.  If history is any indicator at all, he was probably UNDERSTATING the true extent of the problem.

This raises the question: what did Google do about it?  Obviously they didn’t just let their servers fill up until they crashed; we know that didn’t happen.  So what did they do to at least hide the problem from search users?

They started by making changes to the spider.  The spider would no longer even attempt to index every page of a site.  Instead, it would index only "entry pages": those pages that could be reached from another source (links from other sites) or that had a "high likelihood" of being clicked on if the page came up in a search (how that was determined I don’t know).

By drastically reducing the number of pages for which the spider sent indexing data back to the Google servers, they drastically cut the rate of growth of their index database.

The problem, however, is that I have reason to believe those changes had some rather significant bugs. This was then compounded by an application that Google’s engineers wrote to go through the database of cached pages and remove "no longer needed" cached images.

Unfortunately, it would appear that the application had some rather severe bugs that caused current and useful pages to be dropped from the cache while some older and non-useful pages were kept.

This put Google into a state of disarray from which they really have not yet recovered (at least in my opinion).

You may have even noticed that Google is not indexing many of your pages and that the caching of your pages seems just a bit "off."  This is the fallout from the server problem (and the attempted fixes) that I just spoke about.

Now, combine that state of disarray with what’s coming down the pike (I call it "The New Rules"). Google is all set to completely alter the way it determines page relevance, all the way down to how it determines "link strength."

You see, Google still determines link strength based on the linking page’s PageRank. That is about to undergo a huge change.  Google is now going to determine "link strength" by the number of unique clickthroughs a link receives, REGARDLESS of the PageRank of the linking site.

In other words, if you get a link from what is currently a PR 3 site that gets huge clickthroughs, that will be more valuable than a link from a current PR 7 site that nobody even looks at.

Essentially, Google is going to consider the strength of your site based not only on the link you get and its link text, but also by the "votes" that link gets based on usage from Internet surfers (and it’ll be calculated uniquely to cut down on shenanigans).

Further, the "votes" will be regionally based.  So if you get a lot of clickthroughs from surfers in India (but none from surfers here in the U.S.), then your site will appear near the top of searches performed in India, but will be non-existent for searches here in the United States.  This move too will cut down on shenanigans (hiring folks to simply find your links and click through them).

What it means is that all those sites relying on purchased links from companies like TextLinkAds for their rankings are about to take a POUNDING … unless they are using in-context links (the way SEO Chat does things), which are a bit more pricey.

In my opinion, this change in the way Google calculates link strength is going to cause the biggest upheaval in Google’s SERPs of any update in Google’s history … and that update is coming, in my opinion, in early 2007 (certainly by March).

Are you ready for it?

So the question then becomes what can you do about it?  What can you do BEFORE the update hits, to either preserve your rankings or to improve them?

First and foremost on the list of things to do is to make sure that you have a blog on your site and that you are adding new KEYWORD-RICH content to it at least a couple of times a week.  For this I typically recommend WordPress.

Your blog posts should be "Technorati tagged" (you can get more info on tagging by going here: http://www.technorati.com/tools/#tagging) AND you should have a Technorati account for your blog.

After you post an article, plug your blog into PinGoat (http://www.pingoat.com/).  With this, you are "pinging" your blog to multiple blog search engines (this is basically the "blog and ping" thing you may have heard about).  You’ll also want to use the Technorati ping at http://www.technorati.com/ping.html.
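Under the hood, the pinging step above is a standard XML-RPC call: most ping services, including Technorati’s, accept a `weblogUpdates.ping` method that takes your blog’s name and URL. Here is a minimal Python sketch (the blog name and URLs are placeholders, not real sites):

```python
import xmlrpc.client

def build_ping_payload(blog_name: str, blog_url: str) -> str:
    # Construct the XML-RPC request body for the standard
    # weblogUpdates.ping(name, url) method that ping services expect.
    return xmlrpc.client.dumps((blog_name, blog_url),
                               methodname="weblogUpdates.ping")

payload = build_ping_payload("My Blog", "http://example.com/blog")

# To actually send the ping (a live network call, so commented out here):
# proxy = xmlrpc.client.ServerProxy("http://rpc.technorati.com/rpc/ping")
# result = proxy.weblogUpdates.ping("My Blog", "http://example.com/blog")
```

Services like PinGoat simply fan this same call out to many blog search engines for you, which is why a single form submission there does the job.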

What these services will do is potentially get you a number of backlinks that are MUCH more likely to be used/clicked through.

Next, at least a couple of times a month you’ll want to put out a press release via PRWeb (I cover this in much more detail in my "New Rules For Google SEO" at http://www.dannywall.com/newrules).  The release should be keyword-rich.

One thing I want to point out about news releases: first, you want to write them in the third person, as if a reporter were writing about you.  Second, you want to FOCUS YOUR RELEASE on a fairly small topic area/keyword.  You don’t want to write a release that is all things to all people.  Instead, your release should narrowly target a single key phrase.

News releases like that have a MUCH greater chance of getting clicked through.

At this point, notice what I’m saying here.  The focus IS NO LONGER on simply getting a link, but getting a link that people will actually USE.

This has a double benefit for you.  First of all, links that get used will dramatically improve your Google rankings … BUT MORE IMPORTANTLY, links that get used give you traffic right then!

I know that sounded obvious, but I find that my clients are often so focused on improving their Google rankings, and therefore the traffic they get from Google, that they completely miss out on sources of traffic they could be getting RIGHT NOW (rather than waiting however long it takes their rankings to improve).

Additionally, in the past I’ve written about the power of writing and distributing articles (you can read one of them here: http://www.seochat.com/c/a/Website-Promotion-Help/Insider-Secret-To-Killer-PR/).  I’ve talked about the traffic that you can get and the benefit to your search rankings. That strategy is going to be more powerful now than ever before!

I can’t stress that enough.  Now that Google is going to count the number of unique clickthroughs of a link, getting your articles published to authority sites will be more valuable to your rankings than ever; and that benefit doesn’t even count all of the benefit you get from the traffic you get as a result of the article in the first place.

My point here is that finding relevant authority sites and giving them the content they need is going to be a competition-killing strategy in very short order.  This means you need to begin asking yourself, "What are the authority sites in my niche?"

Find out what those sites are, what kinds of content they are looking for, and what their guidelines are for articles, and then write some good, strong, no-nonsense articles for them (some authority sites will even PAY YOU to do it … SEO Chat is a good example of this).  The authority site benefits by getting the content it needs.  You benefit by getting the traffic.  The customer benefits by easily finding the information they need.

In fact, you’ll notice that by focusing on getting the customer the information they need FIRST (through the various techniques discussed in this article), you benefit with more traffic and better search rankings, PARTICULARLY once Google changes its algorithm.

Simply by focusing on creating good, keyword-rich, highly targeted content that your potential customers might want, and then using a variety of approaches to get that content into the customer’s hands, the customer wins and you win.
