Google Boosts AMP News Pages – Are You Ready For Stage 2?

AMP Is Now Live For News Stories

Accelerated Mobile Pages have now launched in Google's mobile search results for news stories. Prepare for stage 2 by creating AMP pages for your own site.

Google has surprised the industry by launching AMP in its search results a day earlier than expected. Accelerated Mobile Pages are now visible in some of the news listings in mobile search. Search Google on a mobile device for a popular news story and you will see a handful of headline suggestions, some with "AMP" written next to them. This label indicates a page built with AMP.

Impatient Mobile Users

Google reports that 40% of mobile users will abandon a website that takes more than 3 seconds to load. As the average mobile page takes 8 seconds, this is clearly a problem for today's impatient mobile users and for the site owners trying to hold their attention.

What Are Accelerated Mobile Pages?

The open-source AMP project was created by Google to counter this problem. So far the search engine has gained backing from platforms such as Twitter and Pinterest, analytics tools providers and advertising networks. The purpose of AMP is to serve users incredibly fast-loading mobile pages that enhance usability. Pages are built with AMP HTML, a stripped-down form of HTML.

Only certain basic elements can feature on an AMP page – anything too sophisticated, such as forms or author-written JavaScript, is not allowed, as it would force the page load speed to lag. AMP pages are also heavily cached, so Google can serve them from its own cache rather than fetching them from the origin at request time. This gives the reader a better user experience, based on pure readability and speed of information delivery.
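To illustrate, a minimal AMP page looks something like the sketch below. The URL, title and image file are placeholders; the required `amp-boilerplate` style block is indicated by a comment and should be copied verbatim from the official AMP project documentation.

```html
<!doctype html>
<html ⚡ lang="en">
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width,minimum-scale=1">
    <!-- The AMP page must link back to the standard (canonical) version -->
    <link rel="canonical" href="https://example.com/article.html">
    <title>Example news story</title>
    <!-- Required amp-boilerplate style block goes here; copy it in full
         from the AMP project documentation -->
    <!-- The AMP runtime is the only JavaScript allowed on the page -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
  </head>
  <body>
    <h1>Example news story</h1>
    <!-- Images use the amp-img element with explicit dimensions,
         so the layout never jumps while loading -->
    <amp-img src="photo.jpg" width="600" height="400"></amp-img>
  </body>
</html>
```

Note the ⚡ attribute on the `html` tag (plain `amp` also works): this is how a page declares itself as AMP, and it is what the validator looks for.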

AMP and Social Networks

Google has been working with social networks such as Twitter and Pinterest, acknowledging the shift in the way users rely on these platforms to point them to interesting links. By conducting trials on social platforms, Google has been able to see how AMP performs in real-world use. The platforms have reported impressive results, with Pinterest seeing AMP pages load four times faster than standard mobile pages.

AMP Rollout

The rollout so far has been focussed on news related stories, as they are primarily pages which are for reading rather than using interactively. However, stage 2 of the AMP rollout is likely to be upon us soon, so it is best that businesses prepare themselves by creating some AMP pages for their site.

Ideally, a company should create an AMP version of every page on its existing site, although this may not always be workable. The best way to do this largely depends on the type of site being run. Those with a CMS such as WordPress would be advised to install a plugin to do the hard work for them. Others can consult the official AMP site, which provides plenty of information on how to write the strict but lightweight AMP markup and then validate it using Google Chrome: write a page, and the validator will highlight any errors that would prevent it from working correctly.
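For sites that keep a standard page alongside its AMP twin, the two versions reference each other with link tags in the head of each document. A sketch, using placeholder URLs:

```html
<!-- On the standard page: advertise the AMP version -->
<link rel="amphtml" href="https://example.com/article.amp.html">

<!-- On the AMP page: point back to the standard version -->
<link rel="canonical" href="https://example.com/article.html">
```

The Chrome validation mentioned above works by appending `#development=1` to the AMP page's URL and opening the developer console, which will either confirm the page is valid AMP or list the offending elements.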

Business owners would be wise to start making the transition to AMP immediately. When Google does launch stage 2 of its rollout programme for AMP, those sites that are prepared will see the benefits from the outset.

Google’s Quality Rating Guidelines: What Stood Out for You?

A few weeks ago, Google published its official Search Result Quality Rating Guidelines, instructing its human raters on how to evaluate Google SERPs.

The guidelines provide lots of insight into how Google defines quality and what it intends its algorithm to understand.

I asked fellow marketers and bloggers to share their main take-away from the guidelines, and here are the answers:

Quality is Equivalent to the Average User's Judgment of Quality

Phil Turner

My main takeaway is that Google is looking for pages that help searchers, exactly as it has always said. Quality is basically equivalent to the average user's judgment of quality.

Yes, that is still vague, but we all know a low quality site when we see one. Similarly we all know a high quality one. We might differ in the details, but if we are talking about general perception I think most people will agree.

Keep Your Content Fresh

David Trounce

Google's Quality Rating Guidelines are a reminder for small businesses, especially e-commerce, to keep their content fresh. The guidelines give special attention to freshness as a measure in their "High Needs Met" (HNM) ratings.

Page 141 of the report tells us,

"For these queries, pages about past events, old product models and prices, outdated information, etc. are not helpful. They should be considered “stale” and given low Needs Met ratings. In some cases, stale results are useless and should be rated FailsM."

If you are providing product information, make sure it is well maintained with current data. This should include a review of on-page SEO factors such as buzz keywords and relevant trends. You can also add value and improve your score in this area by publishing fresh content around product updates and new releases on a well-maintained blog on your site.

For E-A-T (Expertise, Authority, Trustworthiness) websites (which might include technology blogs or tutorial sites, for example), the freshness scale is less important, since a fair amount of content in these fields does not change (think software tutorials or a first-aid procedure). But businesses should still take advantage of the freshness factor and aim for a high Needs Met rating by updating, improving and adding value to existing, static content from time to time.

YMYL (Your Money or Your Life) Sites Are Held to Higher Standards

Tom Shivers

According to the guidelines, Evaluators hold YMYL sites to higher standards.

YMYL is short for Your Money or Your Life. YMYL sites include medical, financial, health and safety sites, sites that require personal identification or provide advice on major life issues, and even e-commerce sites that sell expensive products or products with major life implications:

  • Online banking pages
  • Pages that provide info on investments, taxes, retirement planning, home purchase, buying insurance, etc.
  • Pages that provide info on health, drugs, diseases, mental health, nutrition, etc.
  • Pages that provide legal advice on divorce, child custody, creating a will, becoming a citizen, etc.
  • Many other web pages that fall under YMYL at the discretion of the Evaluator

When an Evaluator identifies a YMYL site, they will research its reputation:

  • Does the site have a satisfying amount of high quality main content?
  • Does the site have a high level of authoritativeness, expertise or trustworthiness?
  • Does the site have a positive reputation?
  • Does the site have helpful supplementary content?
  • Does the site have a functional page design?
  • Is the site a well-cared for and maintained website?

YMYL sites must convince Google Evaluators that they possess a healthy level of credibility and online reputation.

Google Strives to Identify "Main Content" on a Web Page

Casey Markee

I thought one of the big takeaways for me was Google's emphasis on "main content." Google was clear in instructing raters to be on the lookout for, and actively encouraged them to downgrade, pages where the main content is hard to distinguish from ads or other distractions on the page.

To me this is all about user experience and Google's continual desire to make sure its index gives preference to pages with a clear separation between advertising and content. Quality raters are encouraged to give a less-than-helpful rating to pages where the line between the two is blurred. And that, to me, provides a great benefit to users.

Google Does Rely on Humans for Algorithm Evaluation

David Waterman (SEO Rock Star)

Having worked in the SEO industry for over 10 years, the release of the latest Google Quality Rating Guidelines is yet another reminder that Google doesn't rely 100% on bots and algorithms to determine quality online content.

It layers on a human component to ensure that the results Google provides are high quality and match the true intent of the search query.

Make Your Site Mobile-Friendly

Graeme Watt

The biggest takeaway for me was to make your site mobile-friendly if it isn't already. A large proportion of the guidelines is focused on mobile, and it is clear Google now views mobile-friendliness as a sign of a quality website.

If this is the case, it means that anyone producing amazing content on a site which is not mobile friendly is going to be viewed as low quality. This should be avoided at all costs. 

Google Wants to "Think" Like Human Beings Do

Louie Luc

"Quality" and "relevancy".
It just couldn't be simpler than that.

That's what users are searching for when they use a search engine like Google. That's what Google wants to offer its users.

Google aims at thinking more and more like a human being, so that it may "understand", "feel" and "see" what users understand, feel and see when they visit a website suggested by a Google search.

And what are people looking for? Quality relevant sites or web pages.

Put Your Users First

Doyan Wilfred

Put your users first and foremost.

  1. Write high-quality, in-depth, well-researched articles.
  2. Write for users. Optimize for search engines.
  3. Provide helpful navigation (think breadcrumbs).
  4. Invest in clutter-free, user-friendly, mobile-friendly design.
  5. Display your address and contact information clearly.
  6. Create and maintain a positive reputation. Content won't save you if you send hitmen after your customers (true story!).

Expert Content Will Be Rewarded Regardless of Domain Authority

Cormac Reynolds

From what I can gather, one of the main takeaways is that we're coming increasingly closer to a point where quality, expert content will be rewarded regardless of the domain authority of a website.

It seems the algorithm is becoming increasingly intelligent and capable of determining the best content, so those who put the effort into sharing details and information will be rewarded. Personally, I think we're probably still a while away from this as an absolute, but from the look of the guidelines, things are going that way.

The Fundamental Principles Are The Same: Provide Quality, Put the User First…

Tim Felmingham

There's really nothing new here, it's very similar to the guidelines leaked (supposedly unofficially!) in 2008, and a few times since. The overall message is the same as it always was – you need to build sites with original, quality content that provides real value to the searcher.

They have defined a quantitative process for assessing this, including Expertise, Authority, and Trustworthiness, and how well a page meets the searcher's needs. The process is interesting but not revolutionary; it's simply a formal definition of what we all understood anyway.

Many people will flock to this document, in the hope it will give some insights into how to 'game' the system, which of course it won't! Although the general principles of the guidelines will be familiar to anybody involved in SEO, it's still well worth a read, just to make sure there aren't any key areas you have missed in your own site. It will show you how to view your site through the eyes of a Google rater, and more importantly, through the eyes of a user.

The Emphasis is on the Quality

Nashaat

It's clear that Google prefers information posted by a human over machine-generated information when evaluating quality. It also places more emphasis on relevant indicators such as time spent on the website and customer reviews. Again, the emphasis here is on the content of the reviews, not the number of reviews.

And what’s your main take-away?

Google Tells Spammers That Reconsideration Is Not An Option

Google has announced a new change to its webmaster policy – from now on, if your website gets a manual penalty, it may not get a second chance to dominate the listings.

In a change that is designed to hurt black hat SEOs more than the Penguin and Panda updates, Google has stated that it will no longer tolerate those who repeatedly breach Google’s webmaster quality guidelines.

The announcement was made on September 18, 2015, in a blog post on the official Google webmaster central blog: Repeated violations of Webmaster Guidelines. Unlike many of the blog posts that appear on the webmaster central blog, this was written anonymously. Being banned by Google is every webmaster’s worst nightmare and some people take it very personally, so it’s no surprise nobody wished to put their name to the change in policy!

Why the Change Now?

In their blog post, Google says that the change in policy is to further improve the quality of the search results. Google has carried out many major changes over the last five years that have made great progress in improving search quality, but one problem still persists – some people are still gaming the search engine.

Google started cleansing the SERPs with the Panda series of updates. These penalised websites that had low quality, poorly written, duplicate and thin content pages. The old SEO trick of creating a new page for each target keyword and filling it with spammy, keyword stuffed text was over.

Google followed this with a series of updates that tackled the problem of link spam – the now infamous Penguin updates. Suddenly, SEO was no longer about getting more links than the competition. Links now have to be earned, and acquired from very high quality sources. The inner workings of Penguin are still unclear, but it seems that just about every easy-to-get link is now devalued – many are probably totally worthless. Directory links, comment links, blogroll links, forum signatures etc. are now deemed by many to be ineffective for SEO.

However, some SEOs are still doing very well: although Google has improved its search algorithm, with some careful planning and some excellent links you can still rank well. In short, unnatural links are still working for many people.

Manual Penalties

Google works hard to identify and manually penalise websites that buy or sell links. Whenever Google has good evidence that somebody has been selling links to pass PageRank, it applies a manual penalty.

However, in these cases, manual penalties are easy to recover from. All you need to do is remove the links (or add a nofollow attribute) and submit a reconsideration request, explaining how very sorry you are and begging to have the penalty lifted. In the past, Google trusted webmasters enough to grant their wish and lift the penalty.

Unfortunately, some webmasters have exploited this and, after having a reconsideration request approved, they would start selling links again or remove the nofollow tags. This is Google’s main bugbear at the moment, and this latest change in policy directly tackles this problem. Google says that it will no longer tolerate webmasters who repeatedly breach its quality guidelines.

Google has not said exactly how harshly it will treat repeat offenders, saying only in its blog that “repeated violations may make a successful reconsideration process more difficult to achieve”. In some cases, Google may still allow a website to recover – but in other cases, there may well be no way back in the SERPs after a repeat offence.

We have already seen some websites, most notably known link sellers, completely drop out of the Google index. We predict that in the future, more sites will suffer a similar fate. If you are not a white hat SEO, take heed – your days may be numbered!