Google webspam guru Matt Cutts explained the rationale behind Penguin in a post on the Google Webmaster Central blog. He noted that “We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.” Penguin “represents another improvement in our efforts to reduce webspam and promote high quality content.” It’s supposed to decrease the rankings of sites that violate Google’s terms of service.
Cutts gave two examples of websites whose rankings he expected to see drop after Penguin. One displayed egregious keyword stuffing. The other site's offense seemed subtler until you tried to read its text: a poorly written piece about exercise, with links about loans scattered randomly throughout. The article's text clearly had nothing at all to do with the links.
The Penguin algorithm, though going live for all languages at the same time, was expected to have less of an impact than Panda. Cutts noted that Panda’s initial version affected about 12 percent of all queries to some degree; Penguin was supposed to affect only a little over three percent of all English queries. In languages with more heavily spammed sites, Cutts wrote, Penguin could be expected to affect more queries – five percent of Polish queries, for example.
If you’ve been doing SEO for a few years, you’re probably scratching your head right now. Granted, this is just one of a number of techniques Penguin is penalizing, but really – keyword stuffing? That practice is so old, it predates Google, and no good SEO does it anymore! Is Google only now going after these black hat approaches? As Danny Sullivan observed, “It’s not, even though the blog post might give some newcomers that impression…Rather, what’s really happening is that Google is rolling out better ways that it hopes to detect such abuses.”
One reason Google hopes to do better at fighting this kind of webspam is that, unfortunately, it’s not hard to find websites for which it still works. This discourages white hat SEOs from building the kinds of great websites that Google wants searchers to find. Why go to all the work of building an excellent website (so the thinking goes) when some other site can outrank you just by spending a few hours and a little money on techniques that break Google’s Terms of Service?
In fact, Google seems to have finally gotten the message that SEO is not, in and of itself, a bad thing. If you read Cutts’s blog post, he very clearly describes – and encourages – all the good things that white hat SEOs do for websites. These blessings include making sites more crawlable, translating “jargon” into words that normal searchers would use, improving usability, creating great content, improving speed, and so on. It’s not the first time that Google has endorsed white hat SEO, but since it sometimes seems as if the search engine is at war with SEOs in general (as opposed to black hat SEOs in particular), it’s good to see it in black and white.
All of this leads up to one important question: how well does Penguin work? Has it improved Google’s results? Do we now see relevant listings free from keyword stuffing, link schemes, cloaking, deliberate duplicate content and the like?
Danny Sullivan tackles that question by performing a series of experimental searches in Google Chrome’s “Incognito” mode. He compares his results to Bing as a control. He starts with “Viagra” and goes through searches one might expect to produce spammy results. He also includes queries such as “autism resources,” where one might expect medical authority sites, and “SEO,” where he can use his own knowledge of the field to determine the quality of Google’s results.
Sadly, the outcome is not exactly encouraging. The most obvious embarrassment for Google is that Viagra’s official site doesn’t appear anywhere on the first page of results. But an even bigger problem is that, for most of the queries, it’s almost a toss-up between Google and Bing as to which search engine did the better job. As Sullivan observes, these results hit hard against Google. After all, the search engine won its market leader position by delivering better, more relevant results than anyone else in the field. Is it now falling down on the job?
The answer to this question is less obvious than you might think. Sullivan notes that both Google’s and Bing’s results have been getting more and more personalized, and when he has those functions turned on, “some of the results can be radically different.” Also, since Google isn’t exactly sharing Penguin’s secrets, it’s hard to tell what did and didn’t change in the results. Sullivan observed that “There is no lack of people commenting on how terrible things are now, but…that’s what always tends to dominate the forums after any change.”
Another point to keep in mind is that this will hardly be the final version of Penguin. As with Panda, you should probably expect Google to roll out updates as it receives feedback from users and analyzes the effects of the new algorithm on its search results. It may be a couple of months before we see Penguin’s full effects. Right now, though, Sullivan gives it a low grade, saying that he’s been seeing “too much weirdness.”
Should you worry about how Penguin affects your website’s ranking in Google? Sullivan suggests that you ignore your ranking, and watch your traffic over the course of a few days. “Is your overall traffic from Google Search much better than it was before [Penguin]? You’ve probably gained. No change? The update had no net impact on you. A severe drop? Then yes, you got dinged.” With Penguin expected to affect only three percent of searches, though, you probably have less to worry about from this new algorithm than from Panda.
How has Penguin affected you? Have you noticed any difference in your traffic? Feel free to let me know in the comments!