Long-term SEO in Competitive Niches: How We Survived All Google Updates

[Disclaimer: Contributors' views are their own. They do not necessarily represent Devshed's views.]

Barry Schwartz has listed the most competitive niches in SEO: gambling, mortgage, hosting, real estate, travel, etc. We have been doing grey/black-hat SEO in one of these niches for 7 years. Our sites have reached the top 10 for "online casino", "slots", "blackjack" and similar keywords, and they still rank for less competitive but high-ROI keywords. We started with black hat, but we also invested heavily in long-term SEO, since it was obvious that Google would keep improving its algorithms. Most of the sites where we applied a long-term strategy were hit by neither the Google Penguin nor the Google Panda updates.

The famous Moz Search Engine Ranking Factors survey estimated the weight of the top SEO ranking factors in Google: links 40%, content 31%, brand 8%, user/usage/query data 8%, social 7%, other 6%. At the same time, in really competitive niches content and user/usage/query data are not the differentiator: you have already done everything possible by default, simply because all of your competitors have. Thus, sites with good content compete through their backlink profiles.

If you have a "Dentist Eaton Colorado 7th Street" site, you may use natural link building: local business directories, interesting blog articles, sponsored links. And you can claim that paid links are wrong, just as Rand Fishkin does. Still, there are really competitive niches where it's simply impossible to get enough relevant natural links; casino is one example. All competitors use grey/black hat, and you are forced to do the same. We have been monitoring casino SERPs for years, and only a couple of sites (out of hundreds) use natural link building. One remark though: they are all more than 10 years old.

How to get links in competitive niches

1. Cheap techniques

Options:

  1. Web 2.0 links
  2. Bulk blog comments
  3. Forum profiles
  4. WordPress theme footers
  5. Hacked sites
  6. Etc.

Pros:

  1. Very cheap (a permanent link for $0.10-10)
  2. Very fast (less than 1 month)
  3. Easily outsourceable (many freelancers and companies provide such services)

Cons:

These techniques have always been a major target of Google's webspam team. If they still work, it's just a bug that Google will fix very soon.

Read the LinkResearchTools article on how William Hill was penalized.

Conclusion: Cheap techniques should not be used for linking directly to long-term projects.

2. Buy high-quality relevant / irrelevant links

Options:

  1. Good guest posts
  2. In-content page links (forget about footers, sidebars and sitewide links)

Pros:

  1. Affordable (in the casino niche, one good link from a PR2+ page costs $150-500 per year)
  2. Fast (1-6 months)
  3. Outsourceable (if you agree to pay double, of course, since trustworthy mediators may be greedy)
  4. If done right, you can stay in the top 10 for a long time (we track SERPs, and most top-ranked sites use paid links)

Cons:

You don't control the linking sites:

  1. Not agile: you want to change anchors because of Penguin 7.0, but the webmaster doesn't reply to your emails
  2. A lot of fraud: some middlemen pretend to be webmasters, take a year's payment from you, pay the webmaster monthly, and disappear

You need to monitor the sites daily because:

  1. You want to keep a good neighbourhood: you don't want your link posted next to a "cheap viagra" link or on a page with 30 other outgoing links
  2. Source sites may get penalized
  3. Sites may be down for weeks, since uptime might not matter that much to the webmaster

Conclusion: Often worth the cost, yet it gives you no competitive advantage: competitors can see in Majestic where you buy links and buy there too. Sometimes you simply cannot find relevant links and are forced to buy irrelevant ones; they carry less value and may dilute the site's topic.

3. Build high quality relevant links

Options:

  1. Your own sites
  2. Your own blogs

Pros:

  1. Competitive advantage
  2. Complete control
  3. Cheaper than bought links in the long term
  4. Additional ways to build links: link exchange
  5. Additional relevant traffic

Cons:

  1. You need a proven way to make many high-quality, Panda-proof sites
  2. You need to support the sites: add content, pay for hosting
  3. You need to make sure nobody can connect your sites
  4. You need to find ways to get many links to these sites

Conclusion: If you don't make your own sites yet, you should at least think about it. It's very tempting, but you have to do it right.

What to choose?

Option                   | Price   | Speed      | Quality | Control/Agility | Risks
Buy cheap links          | low     | 1 month    | low     | low             | high
Buy relevant links       | high    | 1-3 months | high    | average         | average
Build own relevant links | average | 3-6 months | high    | high            | low

We recommend combining the 2nd and 3rd options:

  1. Stop buying low quality links immediately
  2. Start or continue buying high quality relevant links but choose partners carefully
  3. Make your own sites linking to your important sites to reduce risks; use them also for link exchange, reducing the budget for buying links

Creating hundreds of sites: how to do it wrong

Our first sites used automatically synonymized content. Links from 20 of our relevant sites pushed our important site to #3 in the "online blackjack" SERP within 6 months, and the same happened with other casino keywords. Unfortunately, those days are gone. You now need to create readable content and think about security, because Google's algorithms become more sophisticated every year.

Using WordPress or other widespread CMS is a bad idea

That’s the first thing that comes into SEO’s mind. Many SEO gurus will tell you how to use WordPress for SEO. Still, if you want to make more than 10 sites – don’t invest your time and money into it.

If Google can detect that most of the sites linking to you use the same CMS (like WordPress), that's not a natural pattern, and it's a good reason to penalize your site.

Here are some ideas on how Google can detect WordPress:

  1. Inline text
    1. "Powered by WordPress"
    2. <meta name="generator" content="WordPress 3.8.3" />
    3. <!-- This site is optimized with the Yoast WordPress SEO plugin

      <!-- Performance optimized by W3 Total Cache

  2. Source files in the same directories
    1. Images, CSS and JS in /wp-content
    2. Links to /wp-includes
  3. Existing URLs
    1. /wp-admin (shows the login page) and /wp-login.php
    2. /xmlrpc.php (shows "XML-RPC server accepts POST requests only.")
  4. RSS feed format
    1. <generator>http://wordpress.org/?v=3.8.3</generator>

      (that's my favorite because everyone forgets about it; Google sees that all your linking sites have the same WordPress version and were all updated on the same day: not suspicious at all)

Considering that 10% of all sites use WordPress, Google obviously has a WordPress-detection algorithm in its ranking formula and updates it regularly. If 50% of your links come from WordPress sites, you may be penalized soon.
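
To make the pattern concrete, here is a minimal sketch, in Python, of the kind of fingerprint scan just described. The patterns mirror the list above; the sample HTML is a made-up example, and a real crawler would of course fetch each page over HTTP first:

```python
import re

# WordPress fingerprints from the list above (the sample HTML below is hypothetical).
WP_PATTERNS = {
    "generator meta": r'<meta name="generator" content="WordPress',
    "powered-by text": r'Powered by WordPress',
    "yoast comment": r'<!-- This site is optimized with the Yoast WordPress SEO plugin',
    "w3-total-cache comment": r'<!-- Performance optimized by W3 Total Cache',
    "wp-content paths": r'/wp-content/',
    "wp-includes paths": r'/wp-includes/',
    "rss generator tag": r'<generator>https?://wordpress\.org/\?v=[\d.]+</generator>',
}

def wordpress_fingerprints(html: str) -> list[str]:
    """Return the names of the WordPress fingerprints found in a page's HTML."""
    return [name for name, pattern in WP_PATTERNS.items()
            if re.search(pattern, html)]

html = '''<html><head>
<meta name="generator" content="WordPress 3.8.3" />
<link rel="stylesheet" href="/wp-content/themes/foo/style.css">
</head><body>Powered by WordPress</body></html>'''
print(wordpress_fingerprints(html))
```

If a large share of your linking sites trip several of these checks, assume Google can see the same thing.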

The same goes for all other popular CMSs (Joomla, Drupal) and even frameworks like Symfony and CakePHP. The common rule is to use technologies that most webmasters use (PHP is preferable to Java), or ones used by less than 1% of them. Google is smart enough to detect widespread technologies: it will notice that you use PHP, but since most sites use it, having all the sites linking to you built on PHP won't be an issue. WordPress, on the other hand, is used by only 10% of webmasters, so you don't want Google to recognize that all the sites linking to you are built on WordPress.

It’s better if your CMS is not open-sourced: in such case, it is much harder for Google and people find connections between your sites.

Fingerprints in custom CMSs

The first thing you should remember: NO FINGERPRINTS. If something is the same across all your sites, Googlebot will find it; if not, your competitors will find it and send it to the Google team. Here are some ideas on what you can do wrong:

  1. Tech stuff:
    1. Same IP or C-class network (11.22.33.44 and 11.22.33.45)
    2. Same NS servers (IrishWonder has an article on how Interflora got penalized)
    3. Same WHOIS
    4. Same domain registrar
  2. Nearby code
    1. Google Analytics (UA-1043770-1, UA-1043770-2, …)
    2. Google AdSense
  3. Same code
    1. Your own statistics code
    2. Your banner management system
    3. Same code in header/footer
  4. Paranoid
    1. Log in to Google services (Google Analytics, Google Webmaster Tools) from the same computer
    2. Visit many of your sites in Google-controlled software in one session: Google Chrome, Browser with Google Toolbar
    3. Find many of your sites in Google in one session: “site:example.com”

It’s hard to go too paranoid in this matter. Check everything so there is nothing common:

  1. HTML code
  2. Scripts code (own and 3rd party like Google Analytics)
  3. Filenames
  4. URL structure
  5. Server headers
  6. Outgoing links format
  7. robots.txt format (especially unusual ones)
  8. etc.
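
As a rough illustration of this "no fingerprints" audit, the following sketch groups sites by a few of the shared attributes listed above: same C-class network, same NS server, same Google Analytics account. All the site names and values here are made up; in practice you would collect them via WHOIS, DNS lookups and HTTP responses:

```python
from collections import defaultdict

# Hypothetical per-site attributes (in reality gathered from WHOIS/DNS/HTTP).
sites = {
    "site-a.example": {"ip": "11.22.33.44", "ns": "ns1.host.example", "ga": "UA-1043770-1"},
    "site-b.example": {"ip": "11.22.33.45", "ns": "ns1.host.example", "ga": "UA-1043770-2"},
    "site-c.example": {"ip": "99.88.77.66", "ns": "ns9.other.example", "ga": "UA-5550001-1"},
}

def shared_fingerprints(sites):
    """Group sites by suspicious shared attributes: C-class network,
    NS server, and Google Analytics account."""
    groups = defaultdict(list)
    for name, attrs in sites.items():
        c_class = ".".join(attrs["ip"].split(".")[:3])   # 11.22.33.44 -> 11.22.33
        ga_account = attrs["ga"].rsplit("-", 1)[0]       # UA-1043770-1 -> UA-1043770
        groups[("c-class", c_class)].append(name)
        groups[("ns", attrs["ns"])].append(name)
        groups[("ga-account", ga_account)].append(name)
    # Only attributes shared by 2+ sites count as fingerprints.
    return {key: names for key, names in groups.items() if len(names) > 1}

for key, names in shared_fingerprints(sites).items():
    print(key, names)
```

Here site-a and site-b would be flagged three times over, which is exactly the kind of connection you want to break before Google or a competitor finds it.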

Find a good hacker and ask them to point out what the given sites have in common. Give them several of your sites mixed with some of your competitors' sites, and task them with figuring out which ones belong to the same owner.

Hundreds of sites: what we did

Content & Design

Content should be cheap yet unique and readable. Make sure that duplicate-content checks are either part of your business process or automatically integrated into your CMS.

It’s a bad idea to get a free/paid WordPress template and use it. Google knows it doesn’t take much effort to create your site. It’s easy for Google to replace all content blocks with “Lorem Ipsum…” and compare screenshots. Consequently – yes, look also matters,and synonymizing <div> classes is not enough.

Many CMSs make creating design templates overcomplicated. Make sure a template takes no more than a day per site to create, and that every design is guaranteed to be unique.

Support

Things you should do:

  1. Track all domain and hostings information:
    1. When domains/hostings expire
    2. Which domain at which domain registrar, hosting, identity (WHOIS)
    3. What are the contact details, login/passwords, secret questions for each registrar and hosting
    4. What is IP (track if it’s changed; don’t buy hostings nearby)
  2. Check if your sites are live. 99% uptime means your sites will be down about 3.65 days a year; if you have 100 sites, then on average one of them will be broken every single day, and you need to fix it or move it to another hosting as soon as possible
  3. Track and check all external links. If you have 10-50 sites, you can still use Excel. Otherwise, find a more automated solution.
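
The uptime arithmetic in point 2 can be checked in a couple of lines:

```python
# 99% uptime across 100 sites: days down per site per year,
# and the expected number of sites that are down at any given moment.
uptime = 0.99
sites = 100

days_down_per_year = (1 - uptime) * 365
expected_down_now = (1 - uptime) * sites

print(f"{days_down_per_year:.2f} days/year down per site")
print(f"on average {expected_down_now:.0f} of {sites} sites down at any moment")
```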

We keep only 1-2 sites on the same hosting; how many sites you allocate per hosting is your choice.

Also, don’t register all domains under the same registrar. It’s too suspicious if your sites have links only from GoDaddy sites.

Link placement

For example, say you have 100 sites linking to your 5 important sites, and you have decided to publish 3 links from each homepage. In total you will have 300 links, which means 60 links for each important site.

You should link not only to your own sites but also to other trustworthy sites in your niche (even to your competitors) to make the profile look more natural. Let's say you decide to add 4 more links from each homepage and 7 more from inner pages to other sites. That comes to 1300 links.

You can find relevant sites and ask them for a link exchange; this is how you get those 1300 links back from other sites. Combined with your own 300 links, that is roughly 5 times more than from your sites alone, and it's less risky because it looks more natural.

Tip: Always make a noticeable “Contact Us” link from the homepage so that people who want to exchange links could contact you.

Get a good software to track links because:

  1. You want to link only to live sites (no broken links)
  2. If a link exchange partner removes your link, you should know about it the same day

Usually 2 programs suffice: a CRM and a link checker, though it would be nice to have them integrated.
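
A link checker of the kind mentioned can start as simple as the sketch below, which parses a partner's page and verifies that your link is still there. The URLs and page content are hypothetical; a real tool would fetch each partner page on a schedule and alert you on the first failure:

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collects href values of <a> tags so we can verify that a partner
    page still links to us."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.hrefs.extend(v for k, v in attrs if k == "href" and v)

def link_present(html: str, target: str) -> bool:
    """True if the page contains a link to `target` (ignoring a trailing slash)."""
    finder = LinkFinder()
    finder.feed(html)
    return any(href.rstrip("/") == target.rstrip("/") for href in finder.hrefs)

# Hypothetical partner page and URLs:
partner_page = '<p>Partners: <a href="https://my-important-site.example/">casino guide</a></p>'
print(link_present(partner_page, "https://my-important-site.example"))
print(link_present(partner_page, "https://other-site.example"))
```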

Budget

That really depends on your needs. We have several kinds of sites: simple ones (15 pages) cost around $300 per site, and more advanced ones (30 pages, better design and content) cost $600 per site.

Therefore, our budget for creating 100 sites is $30,000 to $70,000. As calculated above, you can get 1600 links from those sites (300 links directly and 1300 via link exchange). That's $20-40 per permanent link. You presumably expect your sites to live at least 3-5 years, so you can spread the expense over several years, for an estimated $4-13 per link per year. On one hand, that is much cheaper than buying links from other sites ($150-500 per year); on the other, you can be completely sure of the quality.

Of course you should add:

  1. Support costs: domains, hostings, maintenance
  2. Linkbuilding price for these sites (cheap ways can be used here)
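
For reference, the per-link arithmetic behind the budget estimate above works out as follows (before the support and link building costs just listed; the article rounds the results to $20-40 and $4-13):

```python
# Figures taken from the budget discussion above.
budget_low, budget_high = 30_000, 70_000   # cost of building 100 sites
total_links = 300 + 1300                   # direct links + link-exchange links
lifetime_years = (3, 5)                    # expected site lifetime range

cost_per_link = (budget_low / total_links, budget_high / total_links)
cost_per_link_year = (budget_low / total_links / lifetime_years[1],
                      budget_high / total_links / lifetime_years[0])

print(f"${cost_per_link[0]:.2f}-{cost_per_link[1]:.2f} per permanent link")
print(f"${cost_per_link_year[0]:.2f}-{cost_per_link_year[1]:.2f} per link per year")
```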

Automate everything

The catch is that you need the process and software to fit the budget described. It may take from 6 months to a couple of years if you decide to develop it yourself. Still, a safe future is more important, isn't it?

CMS features you may need:

  1. Backup. Hostings can go down, and sometimes you lose all access. We always have the latest content backed up, and real-time data (contact forms, subscriptions, polls, visitor statistics) is collected every 3 hours.
  2. Easy migration. If a hosting becomes slow or stops working altogether, you may want to move your site elsewhere. This should be a very easy process: it should take minutes, not hours, to transfer a site from one hosting to another, and configuring the site at the new hosting should be simple.
  3. Site availability checks. Pingdom will cost you a fortune if you have hundreds of sites to check. Still, downtime may eat up a part of your budget that exceeds the cost of Pingdom or similar services. We developed our own system because we needed additional information (which hosting is used now and which was used before, how important the site is) and because we needed to detect errors that Pingdom considers acceptable (visible PHP code, missing </html>, etc.).
  4. Easy learning curve
    1. HTML developers. Use a templating system that allows them to copy another site's design and slightly modify it. If they spend less than a day per site and your sites don't share the same HTML, that's enough for a working model.
    2. Copywriters. Make sure adding, modifying and uploading a page takes seconds, not minutes. Also, the process should be simple: your copywriter shouldn't have to spend a month puzzling out your CMS.
  5. Automated error checks. There are a lot of typical mistakes, like unclosed tags, and it's not hard to check for them automatically.
  6. Content history. If a copywriter accidentally removes something important, that should not be a problem.
  7. Automatic randomization. Even outgoing links to affiliate partners should have different formats.
  8. Access control. Copywriters, HTML developers and administrators should have different access levels.
  9. Multi-user support. If 2 copywriters try to edit the same page at the same time, the CMS should prevent it or at least notify them.
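
As an example of the availability-check feature, here is a minimal sketch of health checks that go beyond a plain HTTP ping. The extra checks mirror the errors mentioned above that a service like Pingdom would treat as an acceptable 200 response (visible PHP code, missing </html>); the sample responses are hypothetical:

```python
def page_health(status_code: int, body: str) -> list[str]:
    """Return a list of problems with a fetched page, beyond simple uptime."""
    problems = []
    if status_code != 200:
        problems.append(f"HTTP {status_code}")
    if "<?php" in body:
        problems.append("visible PHP code")    # misconfigured server leaking source
    if "</html>" not in body.lower():
        problems.append("missing </html>")     # page truncated mid-render
    if not body.strip():
        problems.append("empty response")
    return problems

# Hypothetical responses:
print(page_health(200, "<html><body>ok</body></html>"))
print(page_health(200, "<html><body><?php echo $x; "))
```

In a real system each check result would be logged per site, together with the hosting in use, so a recurring failure points you at the hosting to abandon.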

Conclusion

It’s tough to do natural linkbuilding in competitive niches. Thus, you should make it look as natural as possible. It’s a good time to stop using low quality links and raise the bar even for relevant links. If you start making your own sites now you will be prepared to the next Google updates and will have a competitive advantage on top.

There are a lot of issues with public CMSs, so you may need to develop a custom solution. Developing a CMS and several hundred sites may cost $300,000-$500,000. Still, it will pay off even under current conditions, and if Google continues to tighten the screws, it may be the only way to survive.

Featured image is used under Creative Commons License

Google Payday Loan Algo Punishes Spammy Search Terms, Except On YouTube

Recently I put together an article about press release sites taking a huge hit in search rankings, presumably due to the “payday loan” algorithm which is supposed to target highly spammed keywords and sites using spammy techniques.

I spoke to an employee of a press release distribution company (both of whom will remain nameless), and they told me that the initial punishment occurred over the keyword "garcinia cambogia", which gets more than 800,000 searches per month according to SEMrush.

As I continued to write, I decided to search for that keyword and see what the new results were. To my surprise, I found a short YouTube video ranking near the bottom of page 1. After researching the video, I examined its backlink profile and came to the conclusion that it was ranking purely on the strength of spam.

This discovery got me thinking that perhaps YouTube, a Google-owned property, might be "protected" from such actions. After all, the more traffic their videos receive, the more revenue they can generate through ads.

I decided to check some other keywords to see if my theory held true in another niche. After some consideration, I decided to focus on a local SEO keyword, such as "city name SEO". I wanted a term that would have value and some decent search volume.

The keyword I settled on gets roughly 500 searches per month for its "city name SEO" form and could potentially generate a few hundred more visits by ranking for other variations of the same keyword.

Lo and behold, I was able to find a YouTube video ranking in 6th position for this keyword.

Video rank

Well, if it is ranking in the top 10 and Google is attacking spammy backlinks, then this must be a squeaky-clean white hat video, correct?

Think again!

The video has 65 views, yet it has 1,700 backlinks from almost 300 domains. How does that happen? How can only 65 people have viewed the video while 1,700 links have been created for it? Perhaps the links are high quality, so let's take a peek!

After checking the backlink profile in Majestic SEO, I found that most of the links come via blog comments. Wait a minute: blog comments can be white hat, right?

Majestic Example

Of course they can, but when the anchor text is an exact match or some variation of the main keyword, it screams spam. Don't take my word for it; take a look yourself!

Link Example

Notice that this page has been spammed to death and has some unsavory keywords on the same page as the "SEO" keyword. I have redacted most of the information: I just want to point out the facts, not "out" the video in question.

I think it is pretty clear that the site is simply using YouTube as a “host” to spam and rank.

In light of how Google has handled some news sites and the press release distribution sites, I find it rather interesting that it punishes these domains in the name of "search quality" while its very own property can be used to rank for some of these keywords using the shadiest of tactics, with no ill effects.

What are your thoughts?

Managing Your Online Reputation On a Higher Level

There are a number of factors to consider when trying to solidify your brand recognition and overall PR. But when it comes to your presence on the internet, nothing is quite as important as managing your online reputation. It is more crucial than visibility, social media engagement, or even online sales.

Your online image, and how people perceive and speak about your brand, can make or break it. If you have a positive online reputation, you will find things like social media campaigning, direct B2C advertising, and even overall visibility significantly easier.

How You Maintain Your Online Reputation

Brandseye

A number of avenues exist for managing your reputation on the web. For the most part, the process is straightforward: you can have an active reputation management campaign running within a few hours. A few easy methods include:

  • Use a wide variety of tracking tools. The most basic you should be using is Google Alerts. You can sign up to have reports regularly sent to your email every time your brand is mentioned. Other helpful trackers include Brandseye, Rankur, and SocialMention.
  • Have a dedicated social media team. If you have an established business, you want to keep up with social media as much as possible: not just to post regular updates, but to monitor what people are saying about you. If there is a problem, you will want to have it taken care of as quickly as possible. But you want to react to praise as well. If you don't have employees to put on this, you might want to hire a freelancer or two to take care of your profiles.
  • Buy your domain name (and similar domains). It is worth buying up the closest domain matches to your brand. This is important for reputation management because others can use a similar domain for unsavory things. Even if they use it for legitimate business reasons, you will have no control over their company's actions, and it takes away visibility from your real brand.
  • Start better targeting your social media interactions. Not all social media platforms are going to be the right ones for you. Track your data and how much use you have gotten from each. Then narrow down a more specific strategy for not just monitoring, but also fine-tuning your online presence. Your main beats to choose from will be Twitter, Facebook, YouTube, LinkedIn, Pinterest, Google+ and Instagram. Each provides something different, and knowing which work best for you will help solidify a more impressive strategy.
  • Start working that PR. Is there an event going on in your industry that is taking sponsors? A charity that you especially believe in? A scholarship program your community is taking donations for? These are all opportunities. The adage that all publicity is good publicity isn’t true on the web. You want to seek out chances for good PR that you can share socially.

Taking It To a Higher Level (Tools!)

License Direct

The above tactics are the basic methods that should start you off. But once those are underway, you should turn to more advanced tools and tips for your online reputation management.

  • Website security verification is something to start with. It’s not a one-time task, since you need to keep an eye on user security ratings as well.
  • Make sure your brand is registered as trusted. People should be able to check you out quickly and know you are trustworthy and legitimate. There are lots of sites that provide that service (example). A database like LicenseDirect will let you register your company, so people can search out details like verifying credentials, seeing that your professionals are properly licensed, and viewing a full profile for you or your company that is more official than LinkedIn.
  • Social media verification is another step towards higher-quality trust signals.

Do you have any tips for managing your online reputation? Let us know in the comments.

Featured image: approved