The truth is that search engine algorithms have become so complex, accounting for so many different variables, that no one can keep track of them all. Stoney deGeyter discussed this in almost excruciating detail. So if you think you or your SEO can game the system, think again.
Let’s start with the fact that Google was already analyzing something like 200 different ranking signals back in 2009. That was three or four years ago; today, they’re almost certainly analyzing more. And even if they aren’t, it’s a fair bet that they’re not analyzing the same 200 signals that they used then. Search engine developers at Google constantly look for ways to make their algorithms work better. So while we can make a fair guess at some of the signals Google uses, nobody in the SEO community knows all of them.
Next, consider that not all of these signals carry the same weight. How important is it that your domain name features your target keyword? Your title tag? Anchor text? Only Google knows the answer, and they’re not talking. Worse, that answer keeps changing, because it’s only a matter of changing some code to change the importance the search engine grants to any particular signal.
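To make the idea of weighted signals concrete, here is a toy sketch in Python. It is emphatically not Google’s actual algorithm; the signal names, values, and weights are all invented for illustration. The point is simply that when results are ranked by a weighted combination of signals, changing the weights alone, with no change to the pages themselves, can reorder the results:

```python
# Toy illustration of weighted ranking signals (hypothetical signals and weights,
# not Google's real algorithm). Pages are scored by a weighted sum of signal
# values; different weight profiles produce different orderings.

def score(page, weights):
    """Weighted linear combination of a page's signal values."""
    return sum(weights[signal] * value for signal, value in page["signals"].items())

def rank(pages, weights):
    """Return page names ordered from highest to lowest score."""
    return [p["name"] for p in sorted(pages, key=lambda p: score(p, weights), reverse=True)]

pages = [
    {"name": "site-a", "signals": {"keyword_in_title": 1.0, "backlinks": 0.2}},
    {"name": "site-b", "signals": {"keyword_in_title": 0.1, "backlinks": 0.9}},
]

# Weight on-page signals heavily and site-a comes out on top...
print(rank(pages, {"keyword_in_title": 3.0, "backlinks": 1.0}))  # ['site-a', 'site-b']

# ...shift the weight toward backlinks and the order flips.
print(rank(pages, {"keyword_in_title": 1.0, "backlinks": 3.0}))  # ['site-b', 'site-a']
```

That’s the whole trick behind “it’s only a matter of changing some code”: tweak one weight and every ranking downstream can change, without any site doing anything differently.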
Now you may have noticed my use of the term “algorithms,” plural, when talking about Google. That’s because the search giant doesn’t use a single algorithm. Do you suppose it uses the same algorithm for the medical industry that it uses for the construction industry? More than likely, it doesn’t. Those 200-odd signals I mentioned earlier carry different weight and significance in different industries. How far does this granularity go? Well, deGeyter notes that Google might have anywhere from 50 to 200 different algorithms running at any one time. “These different algorithms might be in play for different industries, different types of searches or testing the effects of various algorithm changes before a full push,” he explained.
And it’s just gotten more complicated over time, as Google has added personalization, geo-location, and social signals as factors and filters involved in search. So any particular search can be affected by the searcher’s zip code, search history, previously visited sites, and even the websites his or her friends on Google+ have visited and liked – oh, excuse me, +1ed.
Naturally, Panda and Penguin also make “gaming” Google’s algorithms more complicated. (You were wondering when I’d get to those, right?) These black-and-white critters are no ordinary updates; they’re filters through which the search giant runs its huge index from time to time. As deGeyter explains, “If your site got hit with these updates, fixing the problem that caused it won’t bring immediate results. You have to wait until the next time Google runs the filter.”
So what’s a site owner trying to get top rankings in Google supposed to do? Well, you may have to give up the idea of “top rankings” as such. Say you search Google for your targeted keywords and see your site in the top spot. Turn off everything that tells Google who and where you are, and you may find you’re no longer number one. And even if you still are, many other searchers, who have all of their customizations turned on, may see a different result at the top. It’s a catch-22.
Fortunately, there is a way to resolve it. Keep an eye on algorithm changes, but don’t try to chase them. And remember what you’re really trying to do! I admit deGeyter says it better than I can: “It’s not about trying to get just the right amount of words on a page, your keywords in just the right spot or even about the perfect backlink graph. It’s about building a site that visitors love, focusing on the keywords they search and doing it better than your competitors.” Good luck!