Google’s Quality Rating Guidelines: What Stood Out for You?

A few weeks ago, Google published its official Search Result Quality Rating Guidelines, which instruct its human raters on how to evaluate Google SERPs.

The guidelines provide lots of insight into how Google defines quality and what they intend their algorithm to understand.

I asked fellow marketers and bloggers to share their main take-away from the guidelines; here are their answers:

Quality is Equivalent to the Average User's Judgment of Quality

Phil Turner

My main takeaway is that Google is looking for pages that help searchers, exactly as it has always said. Quality is basically equivalent to the average user's judgment of quality.

Yes, that is still vague, but we all know a low-quality site when we see one. Similarly, we all know a high-quality one. We might differ in the details, but if we are talking about general perception, I think most people will agree.

Keep Your Content Fresh

David Trounce

Google's Quality Rating Guidelines are a reminder for small businesses, especially in e-commerce, to keep their content fresh. The Guidelines give special attention to freshness as a factor in their "High Needs Met" (HNM) ratings.

Page 141 of the report tells us,

"For these queries, pages about past events, old product models and prices, outdated information, etc. are not helpful. They should be considered “stale” and given low Needs Met ratings. In some cases, stale results are useless and should be rated FailsM."

If you are providing product information, make sure it is well maintained with current data. This should include a review of on-page SEO factors such as buzz keywords and relevant trends. You can also add value and improve your score in this area by adding fresh content about product updates and new releases via a well-maintained blog on your site.

For E-A-T (Expertise, Authority, Trustworthiness) websites (which might include technology blogs or tutorial sites, for example), the freshness scale is less important, since a fair amount of content in this field does not change (think software tutorials or a first aid procedure). But businesses should still take advantage of the freshness factor and aim for a High Needs Met rating by updating, improving and adding value to existing, static content from time to time.

YMYL (Your Money or Your Life) Sites Are Held to Higher Standards

Tom Shivers

According to the guidelines, Evaluators hold YMYL sites to higher standards.

YMYL is short for Your Money or Your Life. These sites include medical, financial, health and safety sites, sites that require personal identification or provide advice on major life issues, and even ecommerce sites that sell expensive products or products with major life implications:

  • Online banking pages
  • Pages that provide info on investments, taxes, retirement planning, home purchase, buying insurance, etc.
  • Pages that provide info on health, drugs, diseases, mental health, nutrition, etc.
  • Pages that provide legal advice on divorce, child custody, creating a will, becoming a citizen, etc.
  • There are many other web pages that fall under YMYL at the discretion of the Evaluator.

When an Evaluator identifies a YMYL site, they will research its reputation:

  • Does the site have a satisfying amount of high quality main content?
  • Does the site have a high level of authoritativeness, expertise or trustworthiness?
  • Does the site have a positive reputation?
  • Does the site have helpful supplementary content?
  • Does the site have a functional page design?
  • Is the site a well-cared for and maintained website?

YMYL sites must convince Google Evaluators that they possess a healthy level of credibility and online reputation.

Google Strives to Identify "Main Content" on a Web Page

Casey Markee

I thought one of the big takeaways for me was Google's emphasis on "main content." Google was clear in instructing raters to be on the lookout for, and actively downgrade, pages where it is hard to distinguish the main content from ads or other distractions.

To me this is all about user experience and Google's continual desire to make sure their index gives preference to pages that have a clear separation between advertising and content. Quality raters are encouraged to give a less than helpful rating to pages where that line is blurred. And that, to me, provides a great benefit to users.

Google Does Rely on Humans for Algorithm Evaluation

David Waterman (SEO Rock Star)

Having worked in the SEO industry for over 10 years, I see the release of the latest Google Quality Rating Guidelines as yet another reminder that Google doesn't rely 100% on bots and algorithms to determine quality online content.

It layers on a human component to ensure the results Google provides are of high quality and match the true intent of the search query.

Make Your Site Mobile-Friendly

Graeme Watt

The biggest takeaway for me was to make your site mobile-friendly if it isn’t already. A large proportion of the guidelines is focused on mobile, and it is clear that Google now views mobile-friendliness as a sign of a quality website.

If this is the case, it means that anyone producing amazing content on a site that is not mobile-friendly is going to be viewed as low quality. This should be avoided at all costs.

Google Wants to "Think" Like Human Beings Do

Louie Luc

"Quality" and "relevancy".
It just couldn't be simpler than that.

That's what users are searching for when they use a search engine like Google. That's what Google wants to offer its users.

Google aims at thinking more and more like a human being so that it may "understand", "feel" and "see" what a user understands, feels and sees when he / she visits a website suggested by a Google search.

And what are people looking for? Quality relevant sites or web pages.

Put Your Users First

Doyan Wilfred

Put your users first and foremost.

  1. Write high-quality, in-depth, well-researched articles.
  2. Write for users. Optimize for search engines.
  3. Provide helpful navigation (think breadcrumbs).
  4. Invest in clutter-free, user-friendly, mobile-friendly design.
  5. Display your address and contact information clearly.
  6. Create and maintain a positive reputation. Content won't save you if you send hitmen after your customers (true story!).

Expert Content will be Rewarded Regardless of Domain Authority

Cormac Reynolds

From what I can gather, one of the main takeaways is that we're coming increasingly closer to a point where quality, expert content will be rewarded regardless of the domain authority of a website.

It seems the algo is becoming increasingly intelligent and capable of determining the best content, so those who put the effort into sharing details and info will be rewarded. Personally, I think we're probably still a while away from this as an absolute, but from the look of the guidelines, things are going that way.

The Fundamental Principles Are The Same: Provide Quality, Put the User First…

Tim Felmingham

There's really nothing new here, it's very similar to the guidelines leaked (supposedly unofficially!) in 2008, and a few times since. The overall message is the same as it always was – you need to build sites with original, quality content that provides real value to the searcher.

They have defined a quantitative process for assessing this, including Expertise, Authority, and Trustworthiness, and how well a page meets the searcher's needs. The process is interesting but not revolutionary; it's simply a formal definition of what we all understood anyway.

Many people will flock to this document, in the hope it will give some insights into how to 'game' the system, which of course it won't! Although the general principles of the guidelines will be familiar to anybody involved in SEO, it's still well worth a read, just to make sure there aren't any key areas you have missed in your own site. It will show you how to view your site through the eyes of a Google rater, and more importantly, through the eyes of a user.

The Emphasis is on Quality

Nashaat

It's clear that, when evaluating quality, Google prefers information posted by humans over machine-generated information. They also place more emphasis on relevant indicators such as time spent on the website and customer reviews. Again, the emphasis here is on the content of the reviews, not the number of reviews.

And what’s your main take-away?

Google Tells Spammers That Reconsideration Is Not An Option

Google has announced a new change to its webmaster policy – from now on, if your website gets a manual penalty, it may not get a second chance to dominate the listings.

In a change that is designed to hurt black hat SEOs more than the Penguin and Panda updates, Google has stated that it will no longer tolerate those who repeatedly breach Google’s webmaster quality guidelines.

The announcement was made on September 18, 2015, in a blog post on the official Google webmaster central blog: Repeated violations of Webmaster Guidelines. Unlike many of the blog posts that appear on the webmaster central blog, this was written anonymously. Being banned by Google is every webmaster’s worst nightmare and some people take it very personally, so it’s no surprise nobody wished to put their name to the change in policy!

Why the Change Now?

In their blog post, Google says that the change in policy is to further improve the quality of the search results. Google has carried out many major changes over the last five years that have made great progress in improving search quality, but one problem still persists – some people are still gaming the search engine.

Google started cleansing the SERPs with the Panda series of updates. These penalised websites that had low-quality, poorly written, duplicate and thin content pages. The days of the old SEO trick of creating a new page for each target keyword and filling it with spammy, keyword-stuffed text were over.

Google followed this with a series of updates that tackled the problem of link spam – the now infamous Penguin updates. Suddenly, SEO was no longer about getting more links than the competition. Links now have to be earned and acquired from very high quality sources. The inner workings of Penguin are still unclear, but it seems that just about every easy-to-get link is now devalued – many are probably totally worthless. Directory links, comment links, blogroll links, forum signatures etc. are now deemed by many to be ineffective for SEO.

However, some SEOs are still doing very well, even though Google has improved its search algorithm. With some careful planning and some excellent links, you can still rank well. In short, unnatural links are still working for many people.

Manual Penalties

Google works hard to identify and manually penalise websites that buy or sell links. Whenever Google has good evidence that somebody has been selling links to pass PageRank, it applies a manual penalty.

However, in these cases, manual penalties are easy to recover from. All you need to do is remove the links (or add a nofollow attribute) and submit a reconsideration request, explaining how very sorry you are and begging to have the penalty lifted. In the past, Google trusted webmasters enough to grant them their wish, and would lift the penalty.
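
To make the mechanical side of that clean-up concrete, here is a minimal sketch of how outbound links in a page's HTML could be given a nofollow attribute in bulk. This is not Google tooling, just an illustration that assumes Python with the BeautifulSoup library and a hypothetical example.com domain; the reconsideration request itself still has to be written and submitted by hand.

```python
from urllib.parse import urlparse
from bs4 import BeautifulSoup

MY_DOMAIN = "example.com"  # hypothetical domain of the penalised site

def nofollow_external_links(html: str) -> str:
    """Add rel="nofollow" to every link that points away from MY_DOMAIN."""
    soup = BeautifulSoup(html, "html.parser")
    for link in soup.find_all("a", href=True):
        host = urlparse(link["href"]).netloc
        if host and MY_DOMAIN not in host:
            rel = set(link.get("rel", []))  # keep any rel values already present
            rel.add("nofollow")
            link["rel"] = sorted(rel)
    return str(soup)

# The paid/outbound link gains rel="nofollow"; the internal link is untouched.
page = '<a href="https://other-site.com/">sponsor</a> <a href="/contact">contact</a>'
print(nofollow_external_links(page))
```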

Unfortunately, some webmasters have exploited this and, after having a reconsideration request approved, they would start selling links again or remove the nofollow tags. This is Google’s main bugbear at the moment, and this latest change in policy directly tackles this problem. Google says that it will no longer tolerate webmasters who repeatedly breach its quality guidelines.

Google has not said exactly how harshly it will treat repeat offenders, saying only in its blog that “repeated violations may make a successful reconsideration process more difficult to achieve”. In some cases, Google may still allow a website to recover – but in other cases, there may well be no way back in the SERPs after a repeat offence.

We have already seen some websites, most notably known link sellers, completely drop out of the Google index. We predict that in the future, more sites will suffer a similar fate. If you are not a white hat SEO, take heed – your days may be numbered!

Google Panda 4.2: Recovery Tips

This month sees another Google Panda update – the first in over 10 months. But version 4.2 is no ordinary Panda, because Google has once again changed the way it updates the search algorithm.

The first Google Panda update was rolled out over a period of a week or two in February 2011, and those that followed were updated in a similar fashion. This all changed in March 2013, when Google announced that the Panda algorithm was to be integrated into the main search algorithm and therefore updates would be effective almost immediately.

This was good news for website owners, as once again it was possible to make positive changes to the content of a website and see quick results. However, the latest Panda update will roll out over a period of several months, according to Google.

So far, few people have reported any Panda-related changes to their Google search referrals, which is as expected for such a slow change.

2%–3% of English language queries

Google has suggested that this update will affect around 2% to 3% of English language queries, which has the potential to be a massive change. What do we know about Panda 4.2? Why has Google needed to make such a big update now? This is the 28th Panda update – surely Google must have optimized this feature of the search engine by now?

What is new?

Google Panda is still a site-wide penalty that examines all of a website’s pages to determine the overall quality of a site. A few thin pages will do no harm, but a website that has hundreds or thousands of empty, thin or duplicate pages will very likely experience a drop in search engine referrals.

When the first Google Panda updates were released, many websites experienced dramatic falls in search referrals, often losing as much as 90% of all search engine traffic. In most cases, the problem was an abundance of thin pages – usually user generated profile pages and empty product pages. Deleting such pages would often lead to a Panda recovery.

For a while Panda took a back seat while Google focussed largely on Penguin, the web-spam algorithm. Now that Penguin has also been integrated into the main algorithm, it seems Google is refocusing on on-site quality factors.

Google has made several specific updates in the last year, all of which point to quality. Google has been promoting secure sites and mobile-friendly sites, and has recently taken a stance against sites that display aggressive pop-ups.

The latest Panda update may simply be incorporating some of these newer quality signals into the main Panda algorithm.

How does this affect your site?

To protect your website from Google Panda you need to focus on building a quality site. This means ensuring that website code and architecture are clean, to prevent thin and duplicate content pages, and also ensuring that the quality of the content is high. To prevent Panda impacting your search positions, or to recover from being hit by Panda, you need to tackle these three areas:

Good web design

In this context, good web design refers to the structure and code of a website. Poorly built websites can cause duplicate and thin content issues that are interpreted by Google as being deliberately spammy.

For example, some content management systems create dynamic pages using parameters, which can be crawled and indexed by Google. Although there are ways to patch problems using Search Console, it is always best to resolve the problems by cleaning up website code and architecture.
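
As a rough illustration of how to spot this on your own site, the short sketch below groups crawled URLs that differ only in their query parameters. It is a minimal example that assumes Python and a hypothetical list of URLs taken from a crawl export or server log; it is not a Search Console feature.

```python
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical URLs taken from a crawl export or server log.
crawled_urls = [
    "https://example.com/shoes?color=red",
    "https://example.com/shoes?color=blue&sort=price",
    "https://example.com/shoes",
    "https://example.com/about",
]

# Group URLs by scheme, host and path, ignoring the query string, so pages
# that differ only in their parameters end up in the same bucket.
groups = defaultdict(list)
for url in crawled_urls:
    parts = urlparse(url)
    groups[(parts.scheme, parts.netloc, parts.path)].append(url)

# Any bucket with more than one URL is a potential duplicate-content cluster
# worth consolidating with cleaner URLs, canonical tags or parameter handling.
for (scheme, host, path), urls in groups.items():
    if len(urls) > 1:
        print(f"Possible duplicates for {path}:")
        for u in urls:
            print("  ", u)
```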

Removal of thin content pages

Any pages that serve no real purpose should be removed. Empty pages are common in many older eCommerce websites – a product line is removed, or just out of stock, and the result is a content page with little more than a title.

Another common problem is profile pages, which are created by website members but contain no unique information. A good CMS will ensure that all of these pages are set to noindex, but unfortunately, many are not. This problem is made worse when members are allowed to add a link back to their own website in their profiles – some Drupal community websites have over 100,000 profile pages created by drive-by spammers – and sites like these are affected by Panda.
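
If you want a quick way to find candidates for removal or noindexing, a simple word-count audit often surfaces them. The sketch below is a minimal example that assumes Python with the requests and BeautifulSoup libraries, a hypothetical list of URLs and an arbitrary 150-word threshold; it is not a definition of what Google itself treats as thin content.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical pages to audit, e.g. member profiles or old product pages.
urls_to_check = [
    "https://example.com/profile/123",
    "https://example.com/product/discontinued-widget",
]

THIN_PAGE_WORD_LIMIT = 150  # arbitrary threshold; tune it for your own site

for url in urls_to_check:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")

    # Drop scripts and styles, then count the words in the visible text.
    for tag in soup(["script", "style"]):
        tag.decompose()
    word_count = len(soup.get_text(separator=" ").split())

    if word_count < THIN_PAGE_WORD_LIMIT:
        print(f"{url}: only {word_count} words, candidate for noindex or removal")
```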

Addition of quality content

Creating content-rich pages, with excellent copy and images, is a great way to ward off the dreaded Panda. Some people believe that Panda not only identifies low-quality content, but also identifies when there is a lack of engagement on a page. Panda 4.0 was thought to factor in user engagement – and Glenn Gabe reported on Search Engine Watch that improving user engagement can reduce the effect of Panda on a website.

A website that experiences a drop in search referrals following a Panda update can often be recovered by technical SEO experts and content managers, who together will improve site architecture and site quality. This is also why so many people are now using WordPress to run their business websites – WordPress provides a clean platform that allows owners to share quality content with ease.

If your website has been affected by the recent Panda update, contact FSE Online today to discuss your recovery package.