Google Tells Spammers That Reconsideration Is Not An Option

Google has announced a new change to its webmaster policy – from now on, if your website gets a manual penalty, it may not get a second chance to dominate the listings.

In a change that is designed to hurt black hat SEOs more than the Penguin and Panda updates, Google has stated that it will no longer tolerate those who repeatedly breach Google’s webmaster quality guidelines.

The announcement was made on September 18, 2015, in a blog post on the official Google webmaster central blog: Repeated violations of Webmaster Guidelines. Unlike many of the blog posts that appear on the webmaster central blog, this was written anonymously. Being banned by Google is every webmaster’s worst nightmare and some people take it very personally, so it’s no surprise nobody wished to put their name to the change in policy!

Why the Change Now?

In its blog post, Google says that the change in policy is intended to further improve the quality of the search results. Google has carried out many major changes over the last five years that have greatly improved search quality, but one problem still persists – some people are still gaming the search engine.

Google started cleansing the SERPs with the Panda series of updates. These penalised websites that had low quality, poorly written, duplicate and thin content pages. The days of the old SEO trick of creating a new page for each target keyword and filling it with spammy, keyword-stuffed text were over.

Google followed this with a series of updates that tackled the problem of link spam – the now infamous Penguin updates. Suddenly, SEO was no longer about getting more links than the competition. Links now have to be earned, and acquired from very high quality sources. The inner workings of Penguin are still unclear, but it seems that just about every easy-to-get link is now devalued – many are probably totally worthless. Directory links, comment links, blogroll links, forum signatures and the like are now deemed by many to be ineffective for SEO.

However, some SEOs are still doing very well. Although Google has improved its search algorithm, with some careful planning and some excellent links you can still rank well. In short, unnatural links are still working for many people.

Manual Penalties

Google works hard to identify and manually penalise websites that buy or sell links. Whenever Google has good evidence that somebody has been selling links to pass PageRank, it applies a manual penalty.

However, in these cases, manual penalties are easy to recover from. All you need to do is remove the links (or add a nofollow attribute) and submit a reconsideration request, explaining how very sorry you are and begging to have the penalty lifted. In the past, Google trusted webmasters enough to grant them their wish, and would lift the penalty.

Unfortunately, some webmasters have exploited this and, after having a reconsideration request approved, they would start selling links again or remove the nofollow tags. This is Google’s main bugbear at the moment, and this latest change in policy directly tackles this problem. Google says that it will no longer tolerate webmasters who repeatedly breach its quality guidelines.

Google has not said exactly how harshly it will treat repeat offenders, saying only in its blog that “repeated violations may make a successful reconsideration process more difficult to achieve”. In some cases, Google may still allow a website to recover – but in other cases, there may well be no way back in the SERPs after a repeat offence.

We have already seen some websites, most notably known link sellers, completely drop out of the Google index. We predict that in the future, more sites will suffer a similar fate. If you are not a white hat SEO, take heed – your days may be numbered!

Are You Overlooking This Important SEO Factor?

No matter how brilliant an SEO you are, trying to SEO a poorly built website after the fact is a challenge. Wouldn’t it make more sense to make sure the web developer you use understands SEO? The most important SEO factor is to design a site correctly in the first place.

The advent of mobile has made it obvious that many sites are badly designed. Many site owners will have to have their sites redesigned to make them mobile friendly. When choosing a web developer, find one that knows both mobile and SEO.

Strong Demand for SEO Savvy Web Developers

According to this new Freelance Guide, web developers are among the best paid of all freelancers. That is not surprising given Google’s push to rank only mobile friendly sites. Anyone interested in freelancing or hiring a freelancer should check out that guide because it contains thorough advice.

Web developers who understand SEO, responsive design, and are wise enough not to want to custom code everything are worth their weight in gold and all too rare.

Developers who think of themselves as coders often build really awful blogs. Instead of using a quality, professional theme and plugins that already exist, they insist on their own coding – even when they have no eye for design. Avoid these developers or risk truly bad results.

Most ecommerce sites and almost all blogs are best built on existing platforms. Professional themes handle more than just appearance. They are built by teams who know SEO and handle the internal linking and navigation far better than a custom coded site (unless the coder is truly brilliant).

How to Evaluate Web Developers

Professional web designers should have an online portfolio of sites they have built. For example, look at the portfolio page for this major Sydney responsive web design company.

Click through to any of the sites they’ve built and view them on your own portable devices. Or use a mobile testing tool to see how they appear at various screen sizes.

Be sure to check multiple sites. Examine the navigation and the design. Do searches in Google and see if the site comes up for the company name and tagline.

Look at the meta descriptions. View source on the main pages of the site and look at the meta tags. Copy and paste the pages into an HTML validation service to check for errors.
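If you would rather not eyeball the source by hand, the same checks can be scripted. The snippet below is a minimal sketch in Python using only the standard library; the URL is a placeholder you would swap for a page from the developer's portfolio, and it simply reports the title, meta description and viewport tag (the viewport tag being a quick hint of mobile friendliness).

    # Minimal sketch: pull the title, meta description and viewport tag from a page.
    # The URL is a placeholder - replace it with a page from the developer's portfolio.
    from html.parser import HTMLParser
    from urllib.request import Request, urlopen

    class MetaTagParser(HTMLParser):
        """Collects the <title> text and any <meta name="..."> tags."""

        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.meta = {}

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self.in_title = True
            elif tag == "meta" and "name" in attrs:
                self.meta[attrs["name"].lower()] = attrs.get("content", "")

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data

    url = "https://example.com/"  # placeholder
    page = urlopen(Request(url, headers={"User-Agent": "Mozilla/5.0"})).read().decode("utf-8", "ignore")

    parser = MetaTagParser()
    parser.feed(page)

    print("Title:      ", parser.title.strip())
    print("Description:", parser.meta.get("description", "MISSING"))
    print("Viewport:   ", parser.meta.get("viewport", "MISSING (page may not be mobile friendly)"))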

If the sites in a company’s portfolio are not mobile responsive or SEO friendly, look for another company.

Web Developers Are Not Necessarily Web Designers

Many business owners do not realize the difference between a web designer and a web developer. A developer installs the framework and plugins. Developers may edit CSS and HTML. They set up email addresses, security, and database backups.

Developers create excellent sites based on existing themes. Some of them can choose color schemes. They should all be able to adjust the width of the columns and the overall site. They know how to use WordPress and configure it properly.

Blog developers are not necessarily experienced at building ecommerce sites. Some are; some are not. Make sure you hire a developer with experience creating the type of site you want.

Most web developers are not web designers. A designer can create custom imagery; they are graphic artists. If you see a site with rounded corners or a unique header graphic, those elements were created by a web designer.

How to Find an SEO Savvy Web Developer

The preferred method of finding talent of any kind is to ask people you know if they can recommend anyone. Request an email or Skype introduction and ask for examples of their work and any contracts they use.

If you don’t find anyone directly, do some searches and look at sites. Find some you like that are ranking for keywords in their industry. Look in the footer for their developer’s name or the name of any theme they are using. (Not all sites provide either one.)

View source and search for the word WordPress. If it is an ecommerce site, look for an indication of what ecommerce platform is being used. Call the business and ask them directly. Most won’t mind telling you who built their site.
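If you want to check more than a couple of sites, the same idea can be scripted. The rough Python sketch below looks for a few common platform fingerprints in a page's source; the URL is a placeholder and the fingerprint list is illustrative rather than exhaustive, so treat a match as a hint, not proof.

    # Rough sketch of the "view source and search for WordPress" step.
    # Fingerprints are common hints only - a match suggests, but does not prove, the platform.
    from urllib.request import Request, urlopen

    FINGERPRINTS = {
        "WordPress": ["wp-content", "wp-includes"],
        "WooCommerce": ["woocommerce"],
        "Shopify": ["cdn.shopify.com"],
        "Magento": ["/skin/frontend/", "mage/cookies"],
    }

    url = "https://example.com/"  # placeholder: the site whose platform you want to identify
    page = urlopen(Request(url, headers={"User-Agent": "Mozilla/5.0"})).read().decode("utf-8", "ignore").lower()

    matches = [name for name, hints in FINGERPRINTS.items() if any(h in page for h in hints)]
    print("Possible platform(s):", ", ".join(matches) if matches else "nothing obvious - ask the business")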

Remember that the developer may have only SEOed the main pages. If blog posts or product pages do not have SEO information on them, blame the site manager – not the developer. If the main pages are not SEOed, look for a different developer. The most important SEO factor is to design a site correctly in the first place.

Google Panda 4.2: Recovery Tips

This month sees another Google Panda update – the first in over 10 months. But version 4.2 is no ordinary Panda, because Google has once again changed the way it updates the search algorithm.

The first Google Panda update was rolled out over a period of a week or two in February 2011, and those that followed were updated in a similar fashion. This all changed in March 2013, when Google announced that the Panda algorithm was to be integrated into the main search algorithm and therefore updates would be effective almost immediately.

This was good news for website owners, as once again it was possible to make positive changes to the content of a website and see quick results. However, the latest Panda update will roll out over a period of several months, according to Google.

So far, few people have reported any Panda-related changes to their Google search referrals, which is as expected for such a slow change.

2%–3% of English language queries

Google has suggested that this update will affect around 2% to 3% of English language queries, which has the potential to be a massive change. What do we know about Panda 4.2? Why has Google needed to make such a big update now? This is the 28th Panda update – surely Google must have optimized this feature of the search engine by now?

What is new?

Google Panda is still a site-wide penalty that examines all of a website’s pages to determine the overall quality of a site. A few thin pages will do no harm, but a website that has hundreds or thousands of empty, thin or duplicate pages will very likely experience a drop in search engine referrals.

When the first Google Panda updates were released, many websites experienced dramatic falls in search referrals, often losing as much as 90% of all search engine traffic. In most cases, the problem was an abundance of thin pages – usually user generated profile pages and empty product pages. Deleting such pages would often lead to a Panda recovery.

For a while Panda took a back seat while Google focussed largely on Penguin, the web-spam algorithm. Now that Penguin has also been integrated into the main algorithm, it seems Google is returning its focus to on-site quality factors.

Google has made several specific updates in the last year, all of which point to quality. Google has been promoting secure sites and mobile friendly sites, and has recently taken a stance against sites that display aggressive pop-ups.

The latest Panda update may simply be incorporating some of these newer quality signals into the main Panda algorithm.

How does this affect your site?

To protect your website from Google Panda you need to focus on building a quality site. This means ensuring that website code and architecture are clean, to prevent thin and duplicate content pages, and also ensuring that the quality of the content is high. To prevent Panda impacting your search positions, or to recover from being hit by Panda, you need to tackle these three areas:

Good web design

In this context, good web design refers to the structure and code of a website. Poorly built websites can cause duplicate and thin content issues that are interpreted by Google as being deliberately spammy.

For example, some content management systems create dynamic pages using parameters, which can be crawled and indexed by Google. Although there are ways to patch problems using Search Console, it is always best to resolve the problems by cleaning up website code and architecture.
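As an illustration, one common way to contain parameter-driven duplicates is a rel="canonical" tag on the parameterised versions pointing back at the clean URL. The Python sketch below checks for that; the URLs and parameters are hypothetical placeholders.

    # Minimal sketch: does a parameterised URL declare a canonical back to the clean page?
    # Both URLs below are hypothetical placeholders.
    import re
    from urllib.request import Request, urlopen

    clean_url = "https://example.com/widgets/"
    param_url = "https://example.com/widgets/?sort=price&page=2"

    page = urlopen(Request(param_url, headers={"User-Agent": "Mozilla/5.0"})).read().decode("utf-8", "ignore")

    canonical = None
    for tag in re.findall(r"<link[^>]+>", page, re.I):
        if re.search(r'rel=["\']canonical["\']', tag, re.I):
            href = re.search(r'href=["\']([^"\']+)["\']', tag, re.I)
            if href:
                canonical = href.group(1)
                break

    if canonical == clean_url:
        print("Parameterised page points its canonical at the clean URL - good.")
    elif canonical:
        print("Canonical points somewhere unexpected:", canonical)
    else:
        print("No canonical tag found - parameter pages may be indexed as duplicates.")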

Removal of thin content pages

Any pages that serve no real purpose should be removed. Empty pages are common in many older eCommerce websites – a product line is removed, or just out of stock, and the result is a content page with little more than a title.

Another common problem is profile pages, which are created by website members but contain no unique information. A good CMS will ensure that all of these pages are set to noindex, but unfortunately, many are not. This problem is made worse when members are allowed to add a link back to their own website in their profiles – some Drupal community websites have over 100,000 profile pages created by drive-by spammers – and sites like these are affected by Panda.
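If you suspect your own site has this problem, a quick audit can confirm it. The Python sketch below fetches a list of suspect pages (the URLs are placeholders) and reports whether each one carries a robots noindex directive.

    # Quick audit sketch: which thin pages (profiles, empty products) are still indexable?
    # The URL list is a placeholder - in practice, feed in URLs from a crawl or sitemap.
    import re
    from urllib.request import Request, urlopen

    urls_to_check = [
        "https://example.com/members/user-1001/",     # hypothetical profile page
        "https://example.com/product/discontinued/",  # hypothetical empty product page
    ]

    for url in urls_to_check:
        page = urlopen(Request(url, headers={"User-Agent": "Mozilla/5.0"})).read().decode("utf-8", "ignore")
        directives = ""
        for tag in re.findall(r"<meta[^>]+>", page, re.I):
            if re.search(r'name=["\']robots["\']', tag, re.I):
                content = re.search(r'content=["\']([^"\']+)["\']', tag, re.I)
                if content:
                    directives = content.group(1).lower()
                    break
        status = "noindex" if "noindex" in directives else "INDEXABLE - review or remove this page"
        print(url, "->", status)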

Addition of quality content

Creating content-rich pages, with excellent copy and images, is a great way to ward off the dreaded Panda. Some people believe that Panda not only identifies low quality content, but also identifies when there is a lack of engagement on a page. Panda 4.0 was thought to factor in user engagement – and Glenn Gabe reported on Search Engine Watch that improving user engagement can reduce the effect of Panda on a website.

A website that experiences a drop in search referrals following a Panda update can often be recovered by technical SEO experts and content managers, who together will improve site architecture and site quality. This is also why so many people are now using WordPress to run their business websites – WordPress provides a clean platform that allows owners to share quality content with ease.

If your website has been affected by the recent Panda update, contact FSE Online today to discuss your recovery package.