We are at ClickZ this week reporting on some of the most interesting panels, and we start with one of the most popular topics: Google's updates and penalties.
Chris Boggs is first on stage, talking about the history of Google updates as well as the differences between the Panda and Penguin updates.
The first named Google algorithm update was Boston, followed by Panda in 2011. Penguin came out the following year, in 2012.
3 “Ps” of Google updates:
- Panda hitting weak and (nearly) duplicate content
- Penguin hitting unnatural links
- Pigeon hitting local businesses
You can refer to both Google Analytics and Google Webmaster Tools to identify what kind of penalty hit your site. Google Analytics is good for identifying algorithmic penalties (check the dates the updates rolled out and see whether your traffic dropped on those specific days). Google Webmaster Tools notifies you of any manual penalties.
When analyzing a traffic drop, don't forget to account for seasonality, site updates, downtime, and other factors that can cause natural traffic spikes and dips without any Google action being involved.
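The date-matching approach the panel describes can be sketched in a few lines. This is a minimal illustration with hypothetical traffic numbers (in practice you would export daily organic visits from Google Analytics); the update date shown is the publicly reported Panda 4.0 rollout.

```python
from datetime import date

# Hypothetical daily organic-visit counts; in practice, export these
# from Google Analytics.
traffic = {
    date(2014, 5, 18): 4800,
    date(2014, 5, 19): 4750,
    date(2014, 5, 20): 2100,  # sharp drop
    date(2014, 5, 21): 1950,
}

# Publicly reported algorithm-update dates (Panda 4.0 rolled out
# around May 20, 2014).
update_dates = {date(2014, 5, 20): "Panda 4.0"}

def flag_update_drops(traffic, update_dates, threshold=0.3):
    """Flag any known update date where traffic fell by more than
    `threshold` (default 30%) versus the previous day."""
    hits = []
    days = sorted(traffic)
    for prev, cur in zip(days, days[1:]):
        if cur in update_dates:
            drop = 1 - traffic[cur] / traffic[prev]
            if drop >= threshold:
                hits.append((update_dates[cur], cur, round(drop, 2)))
    return hits

print(flag_update_drops(traffic, update_dates))
```

A 56% single-day drop landing exactly on an update date is a strong hint of an algorithmic penalty; a drop on some other date points toward seasonality, downtime, or a site change instead.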
Be proactive about identifying possible site issues that could lead to penalties (track your backlinks, audit your on-page content, monitor your traffic), but don't go overboard.
Jordan Koene, formerly of eBay and now with Searchmetrics, is on stage next, talking about his experience with Google's updates.
Pigeon is a unique animal because it allows Google to adapt to our behaviors.
TripAdvisor is the biggest Pigeon winner!
Penguin vs Panda:
Penguin = a sharp drop in traffic, usually an isolated event.
Panda = often a slower decline (not as harsh) plus lots of fluctuation (behavior that also resembles a partial penalty).
What is Google Looking for?
Relevance and Quality
Google relies on human reviewers to tell whether it is doing a good job of providing both.
eBay has gone through many updates and filters.
Bottom line: whether your site is big or small, you should always be looking at your content and making it better.
So HOW do You Avoid the Zoo?
- Understand the data (review it constantly)
- Find the right support
- Build in a review process
- Test and learn
[Disclaimer: Contributors' views are their own. They do not necessarily represent Devshed views ]
Last month I wrote an article about Google’s payday loan update which seemed to hit several sites but bypassed YouTube.
This month I have decided to go and take a look at two of the most popular press release sites and see if they have recovered from the penalty.
The first site I looked at was PR Newswire, which was receiving an estimated 441,000 organic visits per month in May, before the penalty. As you can see in the image below, it is now driving an estimated 47,800 organic visits per month.
The second site I have been monitoring is PR Web, one of the other big players in the press release niche. PR Web was getting a whopping 760,000+ estimated organic visits per month in May; since the penalty, it is getting just over 34,000 estimated organic visits.
Because of these penalties, both press release companies have made sweeping changes to what they will allow to be covered in their releases going forward.
For instance, neither agency will allow releases to be published about HCG, green coffee beans, raspberry ketone, Garcinia Cambogia, electronic cigarettes, or payday loans.
Google, meanwhile, will happily advertise most of these products for profit, as seen in the image below.
If you think about the implications of these penalties, it should also be clear that negative SEO can be applied to almost anyone, regardless of the size, age, or reputation of the site in question.
Let's put this in perspective: both of these sites have millions of existing links, built up over many years in business. The behavior of a few black hat SEOs caused the sites to lose rankings and traffic for their keywords, which resulted in up to a 90% drop in estimated organic traffic.
While it is unlikely this was an intentional negative SEO attempt, the result was the same: the sites were penalized for behavior outside of their direct control and for links they did not build themselves.
How difficult would it be to replicate this same pattern of bad behavior and victimize other, smaller sites that are standing in the way of your rankings?
Matt Cutts put out a YouTube video discussing how negative SEO is easily combated through the disavow tool, and how webmasters might find disavowing links to be only a minor inconvenience.
Below is a snapshot of PR Web's 26 MILLION links across nearly 200,000 domains. I am not sure about you, but I would consider reviewing even 1% of those links to be far more than a "minor" inconvenience, even if you are lucky enough to have a full-time webmaster on staff.
The point is that no matter what you do, it is in your best interest to regularly check your backlink profile in Google Webmaster Tools and in third-party tools such as Majestic SEO. The penalty is often not applied right away, and by the time you figure out you have been a victim of negative SEO, you may have to go back months to see where those links came from.
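A regular backlink review can be partly automated. Here is a minimal sketch that parses a hypothetical CSV backlink export (Webmaster Tools and Majestic SEO both offer CSV downloads, though their column names vary) and flags referring domains that link to you suspiciously often, as a crude first pass before manual review or a disavow file:

```python
import csv
import io
from collections import Counter
from urllib.parse import urlparse

# Hypothetical backlink export; real exports have more columns.
export = """source_url,anchor_text
http://spam-blog1.example/p1,payday loans
http://spam-blog1.example/p2,payday loans
http://spam-blog1.example/p3,payday loans
http://news.example/story,PR Web
http://partner.example/page,press releases
"""

def domains_to_review(csv_text, max_links_per_domain=2):
    """Return referring domains with more than `max_links_per_domain`
    links pointing at your site -- candidates for manual review."""
    counts = Counter(
        urlparse(row["source_url"]).netloc
        for row in csv.DictReader(io.StringIO(csv_text))
    )
    return [d for d, n in counts.items() if n > max_links_per_domain]

print(domains_to_review(export))  # → ['spam-blog1.example']
```

Sheer link count from one domain is only one signal, of course; anchor text, topical relevance, and the linking site's own quality all matter before you decide to disavow anything.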
What do you think: has Google made it too easy to use negative SEO as a tactic?
A top ranking on Google is a golden ticket to success. Companies spend fortunes on Search Engine Optimisation and Google advertising to raise awareness of their site, thus boosting their place on search results. Google is fully aware that companies are willing to do almost anything to get to number one on page one. So over the last few years, they have introduced increasingly stringent regulations to ensure that high-ranking websites are not only free of spam but also contain ‘high quality’ content.
While the delineation of quality might seem rather arbitrary, Google builds increasingly complex algorithms in order to ensure that it becomes ever more difficult to get a great ranking without spending some serious money on brand building. Google would perhaps prefer sites to spend money on their own advertising, so Google is increasingly clamping down on those wishing to get a keyword bump, creating an opaque situation that requires constant vigilance.
Google is, first and foremost, a profit-generating enterprise, and the company is second to none in that regard. Its business model from the outset has been to provide a high-quality product that seems simple to the outside world but which, obviously, reflects a highly complex algorithm under the hood, able to provide the most reliable and accurate results on a consistent basis. This simplicity of user experience was evidenced in Google's meteoric rise to the top of the search engine world, eviscerating the competition in the process.
There's not much you can Ask Jeeves these days; he has gone to the cyber afterlife thanks to Google's peerless quality of search. Indeed, the very term 'search' has been replaced with the verb 'to Google,' and a company has surely achieved ubiquity once its name shifts from noun to verb. The danger of such a shift is complacency, and Google has been very aware of this inherent danger in the ever-changing world of tech; it has innovated in area after area, from the Android mobile operating system to Google Maps, Google Earth, and the multitude of G-products that we all use in daily life.
One of the major ways Google has been able to stay ahead of the game is by shifting and modifying the way they calculate the popularity of sites. The general user wouldn’t notice such a change. BBC, Microsoft, and other major corporate entities still dominate their realms as do other major niche providers, but how to decide on the popularity of a travel site, or a site selling sportswear? Their “popularity” is the general answer to the question; but dig a little deeper, and it becomes apparent that popularity is something that can be gained.
Whether it comes from having the most references to the site, the most links pointing to it, or the content that best matches specific search terms, there is a multitude of ways to benchmark popularity and then tailor output to fit within these parameters. The people at Google know that if they rank unhelpful websites on page one, their customers will go elsewhere for their "Googling." Thus, they alter their algorithms fairly often.
Google Panda was Google’s 2011 attempt to restrict websites from being crammed with keywords and building link farms in order to increase their ranking. Quality, as ever, was Google’s priority here as they went about uprooting a whole industry that had been built up by exploiting the loopholes that were evident in the Google Search model.
Five years ago, it was possible, with some concerted effort, to put together a site that could dominate in chosen search terms and maintain a top spot with an increasing amount of links and cloned content. However, Google Panda put an end to all of that as Google started banning sites that had built up their popularity this way.
All of a sudden, sites that were ranking highly started to fall off the main page and into obscurity, as Google's indexing system would blacklist sites that had a negative mark against them. This happened even to sites that were mostly original but had hidden from plain sight practices that were banned under the Google Panda algorithm. This update was nicknamed the Farmer Update, as it put an end to link farms and sent a number of business models into a tailspin.
Things to Watch Out for with Google Panda
- No nonsense! The algorithm is built from human test cases. Programmers analyse many sites and flag content that is off limits. They then build an algorithm around these test cases and work with the algorithm until it is able to function automatically. So, the most obvious lesson is don’t write nonsense. Grammatically incoherent work will get you flagged in no time at all.
- No duplications! If more than 90% of content on your page exists on another page on your site, you are in trouble. So be careful with your headers and borders. If you are repeating the same outline on every page, and your original content is minimal, you are in trouble.
- No advertisement overload! If your page is little more than an advert, you will be blacklisted. By all means advertise, but think smart. The algorithm is so advanced these days that you have to think of it in human terms. Would a human notice that you are hosting a site to link elsewhere? If so, Panda will too.
- No farming! Is there an overload of keywords on your site? While keywords were once the golden ticket to search engine success, now they must be used sparingly and with caution. Of course, you need to optimise your Google search terms, but if you have 100 links to ‘boost your libido,’ you are going to get picked up.
- No robotic content! If you have auto-generating content, you are in trouble. Panda identifies and blocks content that has clearly not been built by humans.
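The "no duplications" rule above can be checked before Panda checks it for you. One standard technique (my illustration, not Google's actual method) is shingling: break each page into overlapping word sequences and compare the sets with Jaccard similarity, where 1.0 means identical content.

```python
def shingles(text, k=4):
    """Split text into overlapping k-word sequences ("shingles")."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=4):
    """Jaccard similarity of two pages' shingle sets: 1.0 = identical,
    0.0 = no shared word sequences at all."""
    sa, sb = shingles(a, k), shingles(b, k)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Two hypothetical page texts that differ by a single word.
page1 = "our widgets are the best widgets money can buy today"
page2 = "our widgets are the best widgets money can buy online"

print(similarity(page1, page2))  # → 0.75
```

Run this pairwise across your own pages: any pair scoring near the 90% mark mentioned above is a candidate for consolidation or a rewrite before the algorithm finds it.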
Cyrus Shepard's August 2011 post 'Beating Google's Panda Update – 5 Deadly Content Sins' is as relevant today as it was back then, and the array of Panda updates rolled out since the algorithm's launch in February 2011 has proven his forward thinking correct.
Once rules change, new ways to play the game are quickly figured out. While Panda hit many firms hard—SEO firms, especially—it was quickly realised that with some modification, it was still possible to nail a top ranking with some intelligent application of keywords. The preferred method at this time was to position keyword heavy articles on sites, often fictitious, and link back to each other. Much of this content was humorous and whimsical, whether for travel-focused websites, dating websites, or any other industry website that it is possible to conceive of. It was rumoured that famous authors were picking up decent fees for stream of consciousness writing filled with hot topic keywords and subordinate keywords.
Before long, the ever-alert Google machine realized what was going on and decided that further algorithm changes were necessary. Thus, the reign of the Panda was not over; rather, it was joined by another anthropomorphic Google algorithm.
Quality was again the catalyst for the launch of Google Penguin in April of 2012. This update targeted what it referred to as “Webspam” with the intention of penalizing sites that did not meet expected standards of quality.
Again, the topic of quality is one which seems to be defined in a rather subjective manner by Google, but overall perhaps ‘usefulness’ would be a better description. Once somebody puts in a search term, does the result they get actively address their enquiry or just point to another site that is off topic? This was the motivation behind the Penguin shift.
Things to Watch Out for with Google Penguin
- Quality. The ever-elusive entity of quality can be measured in various ways. Is the content authoritative? Is it linked to from a variety of sources? How relevant are those sources? Is there a diversity of comments? If so, you will probably pass the quality threshold.
- Link relevancy. Stick to your niche. If you are posting links to sites that are too different from your own, then you may be flagged. Stick to a web of interconnected sites, and you will be able to build your niche positioning.
- Organic linking. If you suddenly have an upsurge in links, this is likely to set alarm bells ringing. Links built up over time have more cachet and add to the perception of your site as authoritative and possessing quality content.
- Diversity of anchor texts. Don’t repeat the same key words in your anchors as this is a major red flag. Use synonyms or similar terms rather than repeating yourself.
- Link quality. Getting links from sites that are red flagged will also come back to haunt you. Be sure that when you are building your links that you are looking for link quality vs. quantity. Quality begets quality after all.
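The anchor-text diversity point above is easy to quantify. A rough sketch, using hypothetical anchors pulled from a backlink export: measure what fraction of your inbound links reuse the single most common anchor text, since a heavily repeated commercial anchor is exactly the red flag Penguin looks for.

```python
from collections import Counter

def anchor_concentration(anchors):
    """Fraction of inbound links using the single most common anchor
    text. A high value (say, above 0.4) suggests an unnatural,
    repetitive link profile."""
    counts = Counter(a.lower().strip() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

# Hypothetical anchor texts from a backlink export.
anchors = [
    "boost your libido",
    "boost your libido",
    "boost your libido",
    "ExampleHealth.com",
    "natural supplements",
]

print(anchor_concentration(anchors))  # → 0.6
```

The 0.4 threshold is my own rough rule of thumb, not a published Google number; the point is simply that branded and varied anchors should dominate, not a single money keyword.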
Jason DeMers recently published yet another extremely informative guide on recovering from Penguin 2.1, which rolled out in October: 'Penguin 2.1: What Changed Since 2.0, and How to Recover.' It should give you a good overview, but Glenn Gabe's follow-up findings were also extremely insightful.
How to Maximise SEO in the World of Panda and Penguin
A good place to start is to analyse your Web traffic. If you have seen a sudden dip, then there is a good chance you have fallen victim to these changes. Google publishes the dates of updates to its algorithm, so compare these with the work you have already done. If you find that you have suffered on these dates, then you need to figure out how this particular update has negatively affected you and start to make some changes.
If you can target the issue and resolve it, wait for 20-30 days and check your traffic again. If there has been no recovery, then you need to go back again and make further changes.
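That recovery check can be made concrete. A minimal sketch with hypothetical numbers: compare average daily visits after your fix against the pre-penalty baseline, and treat the site as recovered once it is back above some fraction of that baseline (the 80% threshold here is my own illustrative choice).

```python
def recovered(pre_penalty_avg, post_fix_visits, threshold=0.8):
    """True if average daily traffic after the fix is back to at least
    `threshold` (default 80%) of the pre-penalty baseline."""
    post_avg = sum(post_fix_visits) / len(post_fix_visits)
    return post_avg >= threshold * pre_penalty_avg

# Hypothetical figures: ~4,500 daily visits before the penalty,
# and a sample of daily visits 20-30 days after the fix.
print(recovered(4500, [3600, 3800, 4000, 4200]))  # → True
print(recovered(4500, [2000, 2100, 2050, 1900]))  # → False
```

If the check still fails after the waiting period, that is your signal, as described above, to go back and make further changes.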
As ever, Rand Fishkin leads the way when it comes to all things SEO, and his presentation at the Digital Marketing summit is essential viewing for all those who wish to understand the issues facing companies who wish to dominate on Google today.
In conclusion, the key lesson to take from all of this is that Google is going to keep innovating, and it is going to keep aiming to increase the quality of search results. Expect further changes in the years to come as Google seeks to solidify its position and remain top dog.
Be extra careful with anything you publish on your site. If it is a copy-and-paste job, you risk a ban. If it is full of links that seem irrelevant, you will be banned. Low-quality content puts your site at risk of being banned too. Google's algorithm is turning into an editor that expects high-quality content if it is going to give a site front-page space on its search engine. So keep the editor happy with clean, well-written, and interesting prose; connect with a professional SEO company; and watch your stock rise.