This month sees another Google Panda update – the first in over 10 months. But version 4.2 is no ordinary Panda, because Google has once again changed the way it updates the search algorithm.
The first Google Panda update was rolled out over a period of a week or two in February 2011, and those that followed were updated in a similar fashion. This all changed in March 2013, when Google announced that the Panda algorithm was to be integrated into the main search algorithm and therefore updates would be effective almost immediately.
This was good news for website owners, as once again it was possible to make positive changes to the content of a website and see quick results. However, the latest Panda update will roll out over a period of several months, according to Google.
So far, few people have reported any Panda-related changes to their Google search referrals, which is as expected for such a slow change.
2%–3% of English language queries
Google has suggested that this update will affect around 2% to 3% of English language queries, which has the potential to be a massive change. What do we know about Panda 4.2? Why has Google needed to make such a big update now? This is the 28th Panda update – surely Google must have optimized this feature of the search engine by now?
What is new?
Google Panda is still a site-wide penalty that examines all of a website’s pages to determine the overall quality of a site. A few thin pages will do no harm, but a website that has hundreds or thousands of empty, thin or duplicate pages will very likely experience a drop in search engine referrals.
When the first Google Panda updates were released, many websites experienced dramatic falls in search referrals, often losing as much as 90% of all search engine traffic. In most cases, the problem was an abundance of thin pages – usually user generated profile pages and empty product pages. Deleting such pages would often lead to a Panda recovery.
For a while Panda took a back seat while Google focussed largely on Penguin, the web-spam algorithm. Now that Penguin has also been integrated into the main algorithm, it seems Google is refocusing on on-site quality factors.
Google has made several specific updates in the last year, all of which point to quality. It has promoted secure sites and mobile-friendly sites, and has recently taken a stance against sites that display aggressive pop-ups.
The latest Panda update may simply be incorporating some of these newer quality signals into the main Panda algorithm.
How does this affect your site?
To protect your website from Google Panda you need to focus on building a quality site. This means ensuring that website code and architecture are clean, to prevent thin and duplicate content pages, and ensuring that the quality of the content itself is high. To prevent Panda impacting your search positions, or to recover from being hit by Panda, you need to tackle these three areas:
Good web design
In this context, good web design refers to the structure and code of a website. Poorly built websites can cause duplicate and thin content issues that are interpreted by Google as being deliberately spammy.
For example, some content management systems create dynamic pages using parameters, which can be crawled and indexed by Google. Although there are ways to patch problems using Search Console, it is always best to resolve the problems by cleaning up website code and architecture.
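As a rough illustration of the kind of cleanup this involves, the sketch below normalises URLs by stripping query parameters that typically generate duplicate versions of the same page. The parameter list here is purely illustrative – which parameters are safe to drop depends entirely on your own CMS:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters that commonly create duplicate URLs for identical content.
# This set is an illustrative assumption; audit your own CMS before using one.
DUPLICATE_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "sort"}

def canonicalize(url: str) -> str:
    """Return the URL with known duplicate-generating parameters removed."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in DUPLICATE_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

# Two crawlable variants of the same product listing collapse to one URL:
print(canonicalize("https://example.com/shoes?sort=price&utm_source=news"))
# https://example.com/shoes
```

Serving (or redirecting to) one canonical URL like this at the application level is more robust than relying on Search Console parameter settings alone.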
Removal of thin content pages
Any pages that serve no real purpose should be removed. Empty pages are common in many older eCommerce websites – a product line is removed, or just out of stock, and the result is a content page with little more than a title.
Another common problem is profile pages, which are created by website members but contain no unique information. A good CMS will ensure that all of these pages are set to noindex, but unfortunately many are not. The problem is made worse when members are allowed to add a link back to their own website in their profiles – some Drupal community websites have over 100,000 profile pages created by drive-by spammers – and sites like these are affected by Panda.
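Finding these pages is straightforward to automate. The sketch below audits a page's HTML for two of the signals discussed above: a low visible word count (a possible thin page) and the absence of a robots noindex meta tag. The 150-word threshold is an assumption of mine, not a figure Google has published:

```python
import re
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collects the visible word count and whether a robots noindex tag is present."""
    def __init__(self):
        super().__init__()
        self.words = 0
        self.noindex = False
        self._skip = 0  # depth inside <script>/<style>, whose text is not visible

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots" \
                and "noindex" in (attrs.get("content") or "").lower():
            self.noindex = True
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words += len(re.findall(r"\w+", data))

def audit(html: str, thin_threshold: int = 150) -> dict:
    """Flag a page as 'thin' if it is short on words and not already noindexed."""
    p = PageAudit()
    p.feed(html)
    return {"words": p.words, "noindex": p.noindex,
            "thin": p.words < thin_threshold and not p.noindex}
```

Run over a full crawl of your site, a report like this quickly surfaces the empty product and profile pages worth deleting or noindexing.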
Addition of quality content
Creating content-rich pages, with excellent copy and images, is a great way to ward off the dreaded Panda. Some people believe that Panda not only identifies low quality content, but also identifies when there is a lack of engagement on a page. Panda 4.0 was thought to factor user engagement – and Glenn Gabe reported on Search Engine Watch that improving user engagement can reduce the effect of Panda on a website.
A website that experiences a drop in search referrals following a Panda update can often be recovered by technical SEO experts and content managers, who together will improve site architecture and site quality. This is also why so many people are now using WordPress to run their business websites – WordPress provides a clean platform that allows owners to share quality content with ease.
If your website has been affected by the recent Panda update, contact FSE Online today to discuss your recovery package.
[Disclaimer: Contributors’ views are their own. They do not necessarily represent Devshed views.]
A top ranking on Google is a golden ticket to success. Companies spend fortunes on Search Engine Optimisation and Google advertising to raise awareness of their site, thus boosting their place on search results. Google is fully aware that companies are willing to do almost anything to get to number one on page one. So over the last few years, they have introduced increasingly stringent regulations to ensure that high-ranking websites are not only free of spam but also contain ‘high quality’ content.
While the delineation of quality might seem rather arbitrary, Google builds increasingly complex algorithms in order to ensure that it becomes ever more difficult to get a great ranking without spending some serious money on brand building. Google would perhaps prefer sites to spend money on their own advertising, so Google is increasingly clamping down on those wishing to get a keyword bump, creating an opaque situation that requires constant vigilance.
Google is, first and foremost, a profit-generating enterprise; and the company is second to none in that regard. Its business model from the outset has been to provide a high-quality product that seems simple to the outside world but which, obviously, reflects a highly complex algorithm under the hood – able to provide the most reliable and accurate results on a consistent basis. This simplicity of user experience was evidenced in their meteoric rise to the top of the search engine world, eviscerating their competition in the process.
There’s not much you can Ask Jeeves these days; he has gone to the cyber afterlife due to Google’s peerless quality of search. In contrast, the very term ‘search’ has been replaced with the verb ‘to Google.’ It is certain that a company has gained ubiquity once its name shifts from a noun to a verb. The danger of such a shift is complacency, and Google has been very aware of this inherent danger in the ever changing world of tech; they have innovated in various areas, from their Android mobile Operating System to their Google Maps, Google Earth, and the multitude of G-products that we all use in daily life.
One of the major ways Google has been able to stay ahead of the game is by shifting and modifying the way they calculate the popularity of sites. The general user wouldn’t notice such a change. BBC, Microsoft, and other major corporate entities still dominate their realms as do other major niche providers, but how to decide on the popularity of a travel site, or a site selling sportswear? Their “popularity” is the general answer to the question; but dig a little deeper, and it becomes apparent that popularity is something that can be gained.
Whether it comes from having the most references to a site, the most links pointing to it, or the content that best matches specific search terms, there is a multitude of ways to benchmark popularity and then tailor output to fit within these parameters. The people at Google know that if they are ranking unhelpful websites on page one, their customers will go elsewhere for their “Googling.” Thus, they alter their algorithms a fair amount.
Google Panda was Google’s 2011 attempt to restrict websites from being crammed with keywords and building link farms in order to increase their ranking. Quality, as ever, was Google’s priority here as they went about uprooting a whole industry that had been built up by exploiting the loopholes that were evident in the Google Search model.
Five years ago, it was possible, with some concerted effort, to put together a site that could dominate in chosen search terms and maintain a top spot with an increasing amount of links and cloned content. However, Google Panda put an end to all of that as Google started banning sites that had built up their popularity this way.
All of a sudden, sites that were ranking highly started to fall off the main page and into obscurity as Google’s indexing system would blacklist sites that had a negative mark against them. This happened even to sites that were mostly original but had hidden from plain sight practices that were banned under the Google Panda algorithm. This update was nicknamed the Farmer Update as it put an end to link farms and sent a number of business models into a tailspin.
Things to Watch Out for with Google Panda
- No nonsense! The algorithm is built from human test cases. Programmers analyse many sites and flag content that is off limits. They then build an algorithm around these test cases and work with the algorithm until it is able to function automatically. So, the most obvious lesson is don’t write nonsense. Grammatically incoherent work will get you flagged in no time at all.
- No duplications! If more than 90% of content on your page exists on another page on your site, you are in trouble. So be careful with your headers and borders. If you are repeating the same outline on every page, and your original content is minimal, you are in trouble.
- No advertisement overload! If your page is little more than an advert, you will be blacklisted. By all means advertise, but think smart. The algorithm is so advanced these days that you have to think of it in human terms. Would a human notice that you are hosting a site to link elsewhere? If so, Panda will too.
- No farming! Is there an overload of keywords on your site? While keywords were once the golden ticket to search engine success, now they must be used sparingly and with caution. Of course, you need to optimise your Google search terms, but if you have 100 links to ‘boost your libido,’ you are going to get picked up.
- No robotic content! If you have auto-generating content, you are in trouble. Panda identifies and blocks content that has clearly not been built by humans.
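The 90% duplication figure in the list above is the author's rule of thumb rather than a published Google threshold, but you can run a comparable check on your own pages. A common technique for near-duplicate detection is to compare word shingles – short overlapping word sequences – between two pages, sketched here:

```python
def shingles(text: str, k: int = 3) -> set:
    """All k-word shingles of lowercased text: a standard near-duplicate signal."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def overlap(a: str, b: str) -> float:
    """Fraction of page a's shingles that also appear on page b (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa) if sa else 0.0

# Pages whose overlap approaches 1.0 are near-duplicates; identical text scores 1.0.
print(overlap("the quick brown fox jumps", "the quick brown fox jumps"))
```

Comparing every page against every other this way before Google's crawler does tells you which templates are repeating boilerplate with only minimal original content.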
Cyrus Shepard’s August 2011 post ‘Beating Google’s Panda Update – 5 Deadly Content Sins’ is as relevant today as it was back then; reviewing the array of Panda updates since the algorithm launched in February 2011 shows just how correct his forward thinking proved to be.
Once rules change, new ways to play the game are quickly figured out. While Panda hit many firms hard – SEO firms, especially – it was quickly realised that, with some modification, it was still possible to nail a top ranking with some intelligent application of keywords. The preferred method at this time was to position keyword-heavy articles on sites – often fictitious ones – that linked back to each other. Much of this content was humorous and whimsical, whether for travel websites, dating websites, or any other industry website it is possible to conceive of. It was rumoured that famous authors were picking up decent fees for stream-of-consciousness writing filled with hot-topic keywords and subordinate keywords.
Before long, the ever-alert Google machine realized what was going on and decided that further algorithm changes were necessary. Thus, the reign of the Panda was not over, rather it was joined by another anthropomorphic Google algorithm.
Quality was again the catalyst for the launch of Google Penguin in April of 2012. This update targeted what it referred to as “Webspam” with the intention of penalizing sites that did not meet expected standards of quality.
Again, the topic of quality is one which seems to be defined in a rather subjective manner by Google, but overall perhaps ‘usefulness’ would be a better description. Once somebody puts in a search term, does the result they get actively address their enquiry or just point to another site that is off topic? This was the motivation behind the Penguin shift.
Things to Watch Out for with Google Penguin
- Quality. The ever-elusive entity of quality can be measured in various ways. Is the content authoritative? Is it linked to by a variety of sources? How relevant are those sources? Is there a diversity of comments? If so, you will probably be able to pass the quality threshold.
- Link relevancy. Stick to your niche. If you are posting links to sites that are too different from your own, then you may be flagged. Stick to a web of interconnected sites, and you will be able to build your niche positioning.
- Organic linking. If you suddenly have an upsurge in links, this is likely to set alarm bells ringing. Links built up over time have more cachet and add to the perception of your site as authoritative and possessing quality content.
- Diversity of anchor texts. Don’t repeat the same key words in your anchors as this is a major red flag. Use synonyms or similar terms rather than repeating yourself.
- Link quality. Getting links from sites that are red flagged will also come back to haunt you. Be sure that when you are building your links that you are looking for link quality vs. quantity. Quality begets quality after all.
Jason DeMers recently published yet another extremely informative guide on recovering from Penguin 2.1, Penguin 2.1: What Changed Since 2.0, and How to Recover, which should give you a good overview, but Glenn Gabe’s follow-up findings were also extremely insightful.
How to Maximise SEO in the World of Panda and Penguin
A good place to start is to analyse your Web traffic. If you have seen a sudden dip, then there is a good chance you have fallen victim to these changes. Google publishes the dates of updates to its algorithm, so compare these with the work you have already done. If you find that you have suffered on these dates, then you need to figure out how this particular update has negatively affected you and start to make some changes.
If you can target the issue and resolve it, wait for 20-30 days and check your traffic again. If there has been no recovery, then you need to go back again and make further changes.
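The date-matching step above is easy to script. The sketch below compares average daily traffic in the week before each known update against the week after, flagging updates that coincide with a sharp drop. The dates and the 30% drop threshold here are illustrative assumptions – use a maintained algorithm-change history and your own analytics export in practice:

```python
from datetime import date, timedelta

# Approximate launch dates for the two updates discussed in this article.
# Treat these as examples; consult a maintained change history for a full list.
UPDATES = {date(2011, 2, 24): "Panda 1.0", date(2012, 4, 24): "Penguin 1.0"}

def flag_dips(daily_visits: dict, drop_pct: float = 0.3, window: int = 7) -> list:
    """Return update names where mean traffic in the `window` days after the
    update fell by more than `drop_pct` versus the `window` days before it."""
    hits = []
    for day, name in UPDATES.items():
        before = [daily_visits.get(day - timedelta(d), 0) for d in range(1, window + 1)]
        after = [daily_visits.get(day + timedelta(d), 0) for d in range(1, window + 1)]
        avg_b, avg_a = sum(before) / window, sum(after) / window
        if avg_b and (avg_b - avg_a) / avg_b > drop_pct:
            hits.append(name)
    return hits
```

Feed it a `{date: visits}` mapping exported from your analytics tool; any update it flags is a candidate cause worth investigating before you start making changes.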
As ever, Rand Fishkin leads the way when it comes to all things SEO, and his presentation at the Digital Marketing summit is essential viewing for all those who wish to understand the issues facing companies who wish to dominate on Google today.
In conclusion, the key lesson to take from all of this is that Google is going to keep innovating, and it is going to keep aiming to increase the quality of search results. Expect further changes in years to come as Google seeks to solidify itself and remain top dog.
Be extra careful with anything you publish on your site. If it is a copy-and-paste job, you risk a ban. If your site is full of links that seem irrelevant, you will be banned. Low-quality content can also put your site at risk of being banned. Google’s algorithm is turning into an editor that expects high-quality content if it is going to give you front-page space on its search engine. So, keep the editor happy with clean, well-written, and interesting prose; connect with a professional SEO company, and watch your stock rise.