What You Need to Know to Improve SEO Using CDNs

Using a Content Delivery Network (CDN) can improve page load speed and bring more traffic through better search engine positions, but only if you use it correctly. Incapsula, a network optimization firm and provider of cloud-based CDNs that help defend and optimize websites, has shared in-depth knowledge on this topic in multiple posts on their blog. In this post I will summarize the points they made.

Debunking CDN SEO Myths

Let’s start by eliminating some common SEO myths. None of these are true:
  • Many sites on a single IP are bad for SEO; in this Google Webmasters forum discussion, an official Google representative stated, “We generally do not treat sites hosted on CDNs any differently”.
  • CDNs create duplicate content; each copy of your content has exactly the same URL, so Google’s crawlers will not see multiple copies regardless of which CDN location serves the content when they crawl it.
  • Bot blocking will stop Google’s crawlers; bot blockers only block bad bots and do not block crawlers from recognized search engines.
  • CDNs will hurt my ability to rank for my geographic location; the IP address is not the primary signal Google uses to determine the location of the server that hosts your site. Google first looks at your Webmaster Tools setting and TLD country code, and CDNs also whitelist their server locations to prevent localization errors.

CDN Effect on Page Speed

We all know the importance of reducing page load times. Moz has been very clear about how website speed impacts search ranking. What many do not realize is that what really matters is “Time to First Byte” (TTFB). Using a CDN will not improve your SEO unless you optimize not only how long it takes to deliver the first byte, but also what you load. Ilya Grigorik, developer advocate on Google’s “Make the Web Fast” team, rejected a study claiming TTFB does not matter, explaining:
“It’s not only the time that matters, but also what’s in those first few bytes… Of course it doesn’t matter if you flush your ‘200 OK’ after 1ms vs 1s…. But if you know what you’re doing and craft the response right, then it can make a huge difference”.
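If you want to see where your own site stands, TTFB is easy to approximate from the client side. Below is a minimal sketch using only Python’s standard library; the hostname is a placeholder, and the number it prints includes connection setup, so treat it as indicative rather than exact.

```python
# Rough client-side TTFB check using only the standard library.
# "example.com" is a placeholder -- substitute your own hostname.
import http.client
import time

def time_to_first_byte(host, path="/"):
    """Seconds from opening the connection until the first response byte arrives."""
    conn = http.client.HTTPSConnection(host, timeout=10)
    try:
        start = time.perf_counter()
        conn.request("GET", path, headers={"User-Agent": "ttfb-check"})
        response = conn.getresponse()  # returns once the status line and headers are in
        response.read(1)               # pull the first byte of the body
        return time.perf_counter() - start
    finally:
        conn.close()

if __name__ == "__main__":
    print(f"Approximate TTFB for example.com: {time_to_first_byte('example.com'):.3f}s")
```

Run it a few times from different locations if you can; the variation between runs is itself a useful signal of how much a CDN’s geographic distribution could help.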
The primary cause of slow TTFB is the processor time required to dynamically generate HTML pages. A site built on any database-driven CMS (WordPress, for example) dynamically generates its home page for every new visitor. An excellent solution is to classify the HTML components that are static and have them delivered directly from a CDN, with no processing and from the nearest possible location. Some CDNs use advanced caching algorithms to identify and cache more HTML statically, thereby reducing the time and amount of HTML that must be dynamically generated. For example, Incapsula wrote in CDN SEO Benefits and Advantages:
At Incapsula we see a double (and even triple) digit site speed improvement among the websites using our service. This improvement is achieved not only by CDN content Caching and Geo Distribution capabilities, but also by clever uses of existing assets. Namely, we will automatically Minify your source code, GZip your files and even Cache your “Un-Cacheable” dynamically generated content (especially useful for websites with dynamic catalogs, on-site search features and etc). As a result your website will load faster, achieve higher SERP rankings and provide better overall User Experience, thus also improving Bounce and Conversion rates.

More advanced CDNs use various methods of compression to automatically minify your source code and gzip your files for additional reductions in load time. CDNs can be used to improve on-page SEO and search rankings, but only if you choose the right CDN and take advantage of faster TTFB, reduced dynamic HTML, and increased compression.
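To get a rough feel for what the compression side alone is worth, here is a minimal sketch that fetches a page uncompressed and estimates the GZip saving. The URL is a placeholder, and the figure ignores minification, so it only covers part of what a well-configured CDN does.

```python
# Estimate how much GZip compression alone could shave off a page's transfer size.
# The URL is a placeholder -- point it at a page on your own site.
import gzip
import urllib.request

def gzip_saving(url):
    request = urllib.request.Request(
        url, headers={"Accept-Encoding": "identity", "User-Agent": "compression-check"})
    with urllib.request.urlopen(request, timeout=10) as response:
        body = response.read()   # uncompressed, because we asked for "identity"
    compressed = gzip.compress(body)
    return len(body), len(compressed)

raw_size, gzipped_size = gzip_saving("https://example.com/")
print(f"uncompressed: {raw_size} bytes, gzipped: {gzipped_size} bytes, "
      f"saving: {1 - gzipped_size / raw_size:.0%}")
```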

Google Panda 4.2: Recovery Tips

This month sees another Google Panda update – the first in over 10 months. But version 4.2 is no ordinary Panda, because Google has once again changed the way it updates the search algorithm.

The first Google Panda update was rolled out over a period of a week or two in February 2011, and those that followed were updated in a similar fashion. This all changed in March 2013, when Google announced that the Panda algorithm was to be integrated into the main search algorithm and therefore updates would be effective almost immediately.

This was good news for website owners, as once again it was possible to make positive changes to the content of a website and see quick results. However, the latest Panda update will roll out over a period of several months, according to Google.

So far, few people have reported any Panda-related changes to their Google search referrals, which is as expected for such a slow change.

2%–3% of English language queries

Google has suggested that this update will affect around 2% to 3% of English language queries, which has the potential to be a massive change. What do we know about Panda 4.2? Why has Google needed to make such a big update now? This is the 28th Panda update – surely Google must have optimized this feature of the search engine by now?

What is new?

Google Panda is still a site-wide penalty that examines all of a website’s pages to determine the overall quality of a site. A few thin pages will do no harm, but a website that has hundreds or thousands of empty, thin or duplicate pages will very likely experience a drop in search engine referrals.

When the first Google Panda updates were released, many websites experienced dramatic falls in search referrals, often losing as much as 90% of all search engine traffic. In most cases, the problem was an abundance of thin pages – usually user generated profile pages and empty product pages. Deleting such pages would often lead to a Panda recovery.

For a while Panda took a back seat while Google focused largely on Penguin, the web-spam algorithm. Now that Penguin has also been integrated into the main algorithm, it seems Google is refocusing on on-site quality factors.

Google has made several specific updates in the last year, all of which point to quality. Google has been promoting secure sites and mobile-friendly sites, and has recently taken a stance against sites that display aggressive pop-ups.

The latest Panda update may simply be incorporating some of these newer quality signals into the main Panda algorithm.

How does this affect your site?

To protect your website from Google Panda you need to focus on building a quality site. This means ensuring that your website code and architecture are clean enough to prevent thin and duplicate content pages, and also ensuring that the quality of your content is high. To prevent Panda impacting your search positions, or to recover from being hit by Panda, you need to tackle these three areas:

Good web design

In this context, good web design refers to the structure and code of a website. Poorly built websites can cause duplicate and thin content issues that are interpreted by Google as being deliberately spammy.

For example, some content management systems create dynamic pages using parameters, which can be crawled and indexed by Google. Although there are ways to patch problems using Search Console, it is always best to resolve the problems by cleaning up website code and architecture.
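To illustrate how parameter-driven duplicates creep in, the sketch below normalizes URLs before comparing them. The ignored parameter names (utm_*, sessionid, sort) and the sample URLs are assumptions for illustration; adjust them to whatever your CMS actually appends.

```python
# Spot parameter-driven duplicates by normalizing URLs before comparing them.
# The ignored parameter names and sample URLs are illustrative assumptions.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORED_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "sort"}

def normalize(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc, path, urlencode(sorted(kept)), ""))

urls = [
    "https://example.com/shoes?utm_source=newsletter",
    "https://example.com/shoes/?sort=price",
    "https://example.com/shoes",
]
groups = {}
for url in urls:
    groups.setdefault(normalize(url), []).append(url)
for canonical, variants in groups.items():
    if len(variants) > 1:
        print(f"{canonical} has {len(variants)} crawlable variants: {variants}")
```

If a normalized URL groups several crawlable variants, that is exactly the kind of accidental duplication a canonical tag, parameter handling, or a code clean-up should remove.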

Removal of thin content pages

Any pages that serve no real purpose should be removed. Empty pages are common in many older eCommerce websites – a product line is removed, or just out of stock, and the result is a content page with little more than a title.

Another common problem is profile pages, which are created by website members but contain no unique information. A good CMS will ensure that all of these pages are set to noindex, but unfortunately, many are not. The problem is made worse when members are allowed to add a link back to their own website in their profiles – some Drupal community websites have over 100,000 profile pages created by drive-by spammers – and sites like these are affected by Panda.
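A rough way to find the worst offenders is to fetch your profile or product URLs and flag any page that is both thin and still indexable. The sketch below uses only the standard library; the 200-word threshold and the sample URLs are assumptions for illustration.

```python
# Flag pages that are both thin (very little body text) and missing a noindex tag.
# The word-count threshold and sample URLs are illustrative assumptions.
import re
import urllib.request
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.words = 0
        self._skip = 0  # depth inside <script>/<style>, whose text we ignore

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta":
            name = (attrs.get("name") or "").lower()
            content = (attrs.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words += len(re.findall(r"\w+", data))

def audit(url, min_words=200):
    request = urllib.request.Request(url, headers={"User-Agent": "thin-page-audit"})
    with urllib.request.urlopen(request, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = PageAudit()
    parser.feed(html)
    if parser.words < min_words and not parser.noindex:
        print(f"THIN and indexable: {url} ({parser.words} words)")

for url in ["https://example.com/user/1234", "https://example.com/user/5678"]:
    audit(url)
```

In practice you would feed it the URLs from your sitemap or CMS database rather than a hand-written list, then decide page by page whether to delete, noindex, or improve.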

Addition of quality content

Creating content-rich pages, with excellent copy and images, is a great way to ward off the dreaded Panda. Some people believe that Panda not only identifies low-quality content but also identifies when there is a lack of engagement on a page. Panda 4.0 was thought to factor in user engagement – and Glenn Gabe reported on Search Engine Watch that improving user engagement can reduce the effect of Panda on a website.

A website that experiences a drop in search referrals following a Panda update can often be recovered by technical SEO experts and content managers, who together will improve site architecture and site quality. This is also why so many people are now using WordPress to run their business websites – WordPress provides a clean platform that allows owners to share quality content with ease.

If your website has been affected by the recent Panda update, contact FSE Online today to discuss your recovery package.

Will Changing Hosts Affect Search Rankings?

A common question in the SEOChat forums is whether moving your site or changing hosts will affect your SEO rankings and position in the SERPs.

If everything were exactly the same, moving your site between identical servers would not impact your search engine positions, provided the move is done correctly. But everything is never exactly the same.

Each hosting company has different hardware, networks, server software versions, and configurations. How quickly your site loads is important: the faster you can make it, and the more reliable your hosting, the higher you will rank. Factors that impact page load time include:

  • Speed of the server
  • Demand on the server (how many sites with how much traffic)
  • Distance from the end user (the person looking for your site)
Using a CDN can greatly speed up your download times by duplicating the most often requested content on your site across multiple servers. Load balancing and failover speed up your site and also help prevent downtime.
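One practical check here: make sure your static assets send caching headers that a CDN edge can act on, since most CDNs rely on Cache-Control and Expires to decide what they may keep. A minimal sketch, with a placeholder asset URL:

```python
# Inspect the caching headers on a static asset; a CDN generally needs
# Cache-Control / Expires to know what it may keep at the edge.
# The asset URL is a placeholder.
import urllib.request

request = urllib.request.Request(
    "https://example.com/assets/logo.png",
    method="HEAD",
    headers={"User-Agent": "cache-header-check"},
)
with urllib.request.urlopen(request, timeout=10) as response:
    for header in ("Cache-Control", "Expires", "ETag", "Last-Modified"):
        print(f"{header}: {response.headers.get(header, '(not set)')}")
```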

Watch Your Downtime

Extended downtime can definitely negatively impact your rankings on Google and other search engines, so choosing a reliable hosting company is essential to your success.

CDNs also provide serious security, which makes your site far less likely to get hacked. The lengthy outages that come with recovering from hacker damage can largely be avoided.

Be sure to choose a CDN known for security, as some focus mainly on page load speed while others address both speed and security.

Speed Up Download Time

If you are serious about getting your site to load faster, start with these tips. And do note the links to additional information at the bottom of that post. There are some simple things every site owner should know. For example:
  • Your site should be hosted on a server in the country whose audience you most want to reach. See Server Location as a Local Ranking Factor for details.
  • Other sites on a shared server can negatively impact your search engine positions, so choose carefully and know what other sites are on the same server with you. Use this lookup tool to find out.
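As a quick first pass, you can at least check whether two domains resolve to the same IP address. A shared IP is a hint, not proof, that two sites share a server, since CDNs and load balancers muddy the picture; the domains below are placeholders.

```python
# Check whether two domains resolve to the same IP address -- a hint, not proof,
# that they share a server. The domains are placeholders.
import socket

def resolve_pair(domain_a, domain_b):
    ip_a = socket.gethostbyname(domain_a)
    ip_b = socket.gethostbyname(domain_b)
    return ip_a, ip_b, ip_a == ip_b

ip_a, ip_b, shared = resolve_pair("example.com", "example.org")
verdict = "same IP (possibly a shared server)" if shared else "different IPs"
print(f"{ip_a} vs {ip_b} -> {verdict}")
```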
How moving your site will impact your search rankings can be a complex question, but here is a good video that explains some of the variables. Video highlights:
  • Changing servers CAN impact your rankings
  • If your site is still identical, the main concern is other poor quality sites on a shared server
  • Switching between content management systems (from Drupal or Blogger to WordPress, for example) will change your URLs and directories; this can definitely cause major issues if not done correctly
  • Be sure to redirect any URLs that change so you don’t lose the value of those incoming links
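After a move or a CMS switch, it is worth verifying those redirects mechanically rather than spot-checking in a browser. The sketch below checks that each old URL returns a 301 pointing at its expected new home; the URL pairs are placeholders and it uses only the standard library.

```python
# Verify that old URLs 301-redirect to their new homes after a migration.
# The URL pairs are placeholders -- fill in your own mapping.
import http.client
from urllib.parse import urlsplit

REDIRECT_MAP = {
    "https://example.com/old-page": "https://example.com/new-page",
    "https://example.com/index.php?id=42": "https://example.com/blog/new-post",
}

def check_redirect(old_url, expected_target):
    parts = urlsplit(old_url)
    conn_class = (http.client.HTTPSConnection if parts.scheme == "https"
                  else http.client.HTTPConnection)
    conn = conn_class(parts.netloc, timeout=10)
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    conn.request("HEAD", path)          # http.client does not follow redirects
    response = conn.getresponse()
    location = response.getheader("Location", "")
    conn.close()
    ok = response.status == 301 and location == expected_target
    print(f"{'OK ' if ok else 'FIX'} {old_url} -> {response.status} {location}")

for old, new in REDIRECT_MAP.items():
    check_redirect(old, new)
```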
If you’re not happy with your current hosting company, use this guide to moving your site. One common mistake is to take down your existing site before you’ve tested the new one.

Use a Test Site When Moving

An expert tech should leave your existing site up and load a duplicate copy of it onto the new server. That new copy should be checked and tested to ensure it is working correctly BEFORE you re-point the DNS servers to the new location.
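One way to run that check without touching DNS is to request the site from the new server’s IP address while sending your real Host header. Here is a minimal plain-HTTP sketch with placeholder values; for HTTPS you would also need SNI, for example via a hosts-file entry or curl’s --resolve option.

```python
# Fetch the site from the NEW server's IP while keeping the real Host header,
# so the copy can be checked before DNS is repointed. Both values are placeholders.
import http.client

NEW_SERVER_IP = "203.0.113.10"   # placeholder -- the new host's IP address
SITE_HOSTNAME = "example.com"    # placeholder -- your real domain

conn = http.client.HTTPConnection(NEW_SERVER_IP, timeout=10)
conn.request("GET", "/", headers={"Host": SITE_HOSTNAME})
response = conn.getresponse()
print(response.status, response.reason)
print(response.read(200))  # first 200 bytes of the page served by the new box
conn.close()
```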

Another useful tip is to shorten the DNS caching time (the TTL on your domain’s records) well before you start the move. This reduces the amount of time it takes for all of your users to see the site at its new location.
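You can check what TTL your DNS records currently advertise before planning the switch. The sketch below assumes the third-party dnspython package (pip install dnspython) and a placeholder domain.

```python
# Check the current TTL on a domain's A record before a move.
# Assumes the third-party 'dnspython' package; the domain is a placeholder.
import dns.resolver

answer = dns.resolver.resolve("example.com", "A")
print(f"A record TTL: {answer.rrset.ttl} seconds")
for record in answer:
    print(record.address)
```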

Whether changing hosts affects your search rankings will depend upon the quality of the move. If everything is done correctly, there may be a temporary drop in search rankings, but they should quickly recover.