Google removing author pictures from search: Your input?

Last week Google announced it is removing author pictures from search results while keeping the author name. Author pictures in search results were a huge competitive advantage, so no wonder the move was criticized by many authors participating in the Google Authorship program.

The good news: participating in Authorship has become easier…

  • Previously you could never be sure whether your author markup would make it to the SERPs; now all you need is to have your authorship correctly set up (which may also be a bad thing because, let’s face it, it’s now easier for anyone to have)
  • Previously you could only get ONE authorship snippet per SERP; now you’ll have all of them (if several of your articles rank, all of them will carry your name)

I have been discussing this issue around the web and have collected some opinions. My Google Plus thread has lots of great insights, please check it out.

I especially liked this one from Shelly Cihan:

I support the removal. Knowing what a person looks like should not impact whether or not you click on a result in the SERP. It accentuated an already too vain society.

[Hard to disagree: Having an advantage in SERPs because your headshot looks nice doesn't seem fair at all!]

I have also collected some opinions from MyBlogU below:

Our interviewees answered the questions below. Let’s see what they think:

Q. Do you believe Google has done this to optimize for mobile devices? Why or why not? :)

A. David Faltz (Founder, White Rabbit Marketing; Search Engine & Branding Optimization (SEBO) Marketer)

I do believe that mobile probably did play some part in their decision to remove author images, but that is not the whole story for sure. They have been toying with author images for a while now, and they have not gotten people to conform as they wanted. With low adoption rates by what Google would consider “real authors,” and more people using it as a marketing tactic to stand out from the crowd, Google decided “enough was enough!”

A. Swayam Das (Social Media Marketer)

Umm… I really don’t think so! Google always has reasonable logic behind each and every move, so I’ll just wait and see how things work out in the mobile space. Mobile search results tend to be location oriented, so I don’t see much of a shift without the Authorship pics.

A. Marc Nashaat (Enterprise Marketing Consultant)

No, that’s not very likely. Google uses device detection to decide whether to serve up their mobile layout vs. desktop and they could just as easily style mobile to exclude authorship snippets. I don’t think it’s a matter of consistency as Google has been preaching the importance of different user experiences for mobile vs desktop for years now. 

A. Paul Shapiro (SEO Director at Catalyst)

I was a bit baffled at the decision to remove the author images from the SERP. I was a firm believer that when Vic Gundotra left Google, it was not the end of Google+.

This change, however, had me second-guessing the future of the platform. Surely the author images were a HUGE incentive for Google+ usage. Why in the world would they choose to remove one of its most significant features?

I have a number of theories beyond the typical answer that it helps prettify the SERP or creates a better mobile search experience:

  1. Maybe it was negatively affecting AdWords CTR.
  2. Google wants more eyes on knowledge graph.
  3. Now that x number of people are using authorship, they care less about incentivizing its use, or perhaps it started to lead to spammy usage.
  4. It detracted from the CTR of the ranking algorithm. Shouldn’t position 1 get more clicks than position 2? What if it weren’t the case due to an author image?
  5. Google wants to push personalized searches even more and the inclusion of images in those searches actually detracted from this. People would click on personalized search results much too often compared to regular results. They want them to be “blind” to it, by making it visually more integrated.
  6. Google is making big changes to Google+ and how it is integrated with other Google products. There are more big changes coming! 

A. Dave Rekuc (Marketing Director)

Probably not. If it were a mobile-only issue, Google would roll the change out only to mobile devices; they’re smart enough not to treat their entire search audience as one unit. I think what happened is that a feature with good intentions wound up driving results that didn’t actually favor a better search experience, plain and simple. Mediocre articles with author mark-up caught the eye in search results, and good sites that were ignoring the mark-up got passed up.

I’m sure there are 1,001 conspiracy theories claiming that Google rolled out such strong authorship mark-up in its SERPs to lure contributors to Google+. Totally possible, completely unprovable. Whether it did or didn’t, I think it’s fair to assume that Google+ is here to stay and that ignoring authorship mark-up, even after losing the author’s image, is a fool’s errand. We know the web is getting more social, we know Google is paying attention, and it’s easy to implement, so I can’t see why an author should ignore it.

Q. Do you believe @JohnMu that this will not affect click-through? Why or why not? :)

A. David Faltz (Founder, White Rabbit Marketing; Search Engine & Branding Optimization (SEBO) Marketer)

Absolutely not! Google is always trying to convince us they are not the big bad corporation and that their interests are aligned with ours. Though I respect John Mueller, I do believe this is just PR. Third parties have already done all kinds of testing confirming that author images increase CTR. How could they not?! It was a fantastic equalizer in terms of putting less emphasis on where you ranked on any particular SERP.

A. Swayam Das (Social Media Marketer)

I do not believe that CTRs won’t be affected, primarily because if I place myself in the searcher’s position, I would definitely click on results that had images beside them. To my eyes they serve as a signal of being genuine, of someone who holds authority. For example, if I search for “diet pills” and amongst the 10 results I see a doctor’s pic beside a site, then I’ll definitely click on that one and ignore the others, because a normal user won’t know which site is an authority.

A. Marc Nashaat (Enterprise Marketing Consultant)

Not particularly. Putting aside the case studies, common sense tells us that a result with an image is going to stand out more than a plain-text result. When things stand out, they get more attention. Pretty simple. I’m also curious what these observations were based on, and whether they came from SERPs where all (or most) listings had authorship images. If so, it’s possible that you wouldn’t see significantly higher CTRs than on a SERP with all plain-text listings.

It’s hard to come up with ulterior motives for Google on this front. Maybe they’ve found that authorship detracts from ad clicks, but that’s entirely speculation.

A. Paul Shapiro (SEO Director at Catalyst)

The first thing I thought when I heard John Mueller say that the removal of author images in the SERP wouldn’t affect click-through rate was “Okay, that’s easy enough to test”. I doubted that Google would want to make a false claim about something that is so easily tested. Someone will release a study on this subject and we’ll know the truth soon enough.

A. Dave Rekuc (Marketing Director)

I don’t believe that even a little bit.  On a relatively clean search results page, you’re going to tell me that an author’s image doesn’t catch the eye?  In eye tracking studies, human faces come up all the time as one of the first places the eye goes.  We’re definitely going to see a drop in CTR on our articles.  Everyone is losing the article picture at the same time and that may soften the blow, but not every search result contained the mark-up and that’s where we lose our competitive advantage.

Q. Please share how you feel about the change. Will you still care to verify your content after this?

A. David Faltz (Founder, White Rabbit Marketing; Search Engine & Branding Optimization (SEBO) Marketer)

Setting up authorship is really not complicated, and even less so if you are working with WordPress; there are plenty of plugins that make it even easier to implement. I would imagine the change will affect adoption and participation rates moving forward. I think for the most part author verification has been a failed experiment that has mostly been used by internet marketers. Google knows that and wants to take away yet another edge from us ;) G+ may be next! lol

A. Anna Fox (Blogger)

Google still seems to be showing pictures in personalized results, which means you need to seriously work on your G+ following!

The big news for personalized (logged in to your Google account) search is that author photos may still show for Google+ posts by people you have in your circles (h/t to +Joshua Berg). Every other authorship result now looks just like those in the logged-out search example.

A. Swayam Das (Social Media Marketer)

This move by Google kind of coincides with the recent Google+ update! Personally I was wondering if this move is directly signalling a cancellation of Google Authorship in the near future. If that is so then I won’t be verifying my content. Has Google just removed author pics from search results or the entire authorship program? Depends!

A. Marc Nashaat (Enterprise Marketing Consultant)

I don’t agree with the change, but I’ve learned to adapt to the whims of Google. I will definitely still be using authorship markup. If you believe in the future of the knowledge graph, there’s no reason not to. At the very least you’re creating structured data for your content, and that’s never a bad thing. 

A. Paul Shapiro (SEO Director at Catalyst)

I’m going to continue to apply authorship to all of my writing. It still gives me a sense of ownership (especially within search) beyond a simple byline. I also think there are advantages beyond the author image. People can click to see other things I’ve written right within the SERP. It affects personalized search results (probably more important than author images, honestly), and it opens up a world of future benefits in semantic search and the possibility of agent rank, should it ever be used beyond in-depth articles (which is also a benefit).

My gut is telling me this isn’t the end of Google+, but rather one change of many to come in how Google interacts with Google+ and how the Google+ team functions as an organization. Interesting times are ahead of us.

A. Dave Rekuc (Marketing Director)

I honestly think it’s crazy to consider not verifying your content just because the short-term benefit of the author’s image has disappeared. Google has proven a commitment to making Google+ work and to making its search results more personalized. They’ve created a way to structure your contributions across the web and personally build an authority that transcends domains. I think any content creator would still be foolish to ignore authorship at this point.

Now, what’s your input?

Long-term SEO in Competitive Niches: How We Survived all Google Updates

[Disclaimer: Contributors' views are their own. They do not necessarily represent Devshed views.]

Barry Schwartz has listed the most competitive niches in SEO: gambling, mortgage, hosting, real estate, travel, etc. We have been doing grey/black-hat SEO in one of these niches for 7 years already. Our sites have been in the TOP 10 for “online casino/slots/blackjack/…” and still remain there for less competitive but high-ROI keywords. We started with black hat – still, we invested a lot into long-term SEO, since it was obvious that Google would keep improving its algorithms. Most of the sites where we applied a long-term strategy were hit by neither the Google Penguin nor the Google Panda updates.

The famous Moz Search Engine Ranking Factors survey estimated the weight of the top SEO ranking factors in Google: 40% links, 31% content, 8% brand, 8% user/usage/query data, 7% social, 6% other. At the same time, in really competitive niches content and user/usage/query data are not an issue – you have already done everything possible by default, simply because all of your competitors are doing the same. Thus, sites with good content compete for influence by means of their backlink profiles.

If you have a “Dentist Eaton Colorado 7th Street” site, you may use natural link building: local business directories, interesting blog articles, sponsored links. And you can claim that paid links are wrong, just as Rand Fishkin does. Still, there are really competitive niches where it’s just impossible to get enough relevant natural links – casino is an example. All competitors use grey/black hat, and you are forced to do the same. We have been monitoring casino SERPs for years – only a couple of sites (out of hundreds) use natural link building. One remark though: they are all more than 10 years old.

How to get links in competitive niches

1. On-a-budget techniques

Options:

  1. web2.0 links
  2. bulk blog comments
  3. forum profiles
  4. wordpress theme footers
  5. hacked sites
  6. etc.

Pros:

  1. very cheap (permanent link for $0.1-10)
  2. very fast (less than 1 month)
  3. easily outsourceable (a lot of freelancers/companies provide such services)

Cons:

This has always been a major target of Google’s webspam team. If such links still work, it’s just a bug which Google will fix very soon.

Read the LinkResearchTools article on how WilliamHill was penalized.

Conclusion: Cheap techniques should not be used for linking directly to long-term projects.

2. Buy high-quality relevant / irrelevant links

Options:

  1. Good guest posts
  2. In-content page links (forget about footers, sidebars and sitewide links)

Pros:

  1. affordable (in the casino niche one good link from a PR2+ page costs $150-500 per year)
  2. fast (1-6 months)
  3. outsourceable (if you agree to pay double the price, of course, as trustful mediators may be greedy)
  4. if done right, you can stay in the TOP 10 for a long time (we track SERPs and most top-ranked sites use paid links)

Cons:

You don’t control the linking sites:

  1. Not agile: you want to change anchors because of Penguin 7.0, but the webmaster doesn’t reply to your e-mails
  2. A lot of fraud: some middlemen pretend to be webmasters, take a year’s payment from you, pay the webmaster monthly and then disappear

You need to monitor the sites daily, because:

  1. You want to keep a good neighbourhood: you don’t want to be posted next to a “cheap viagra” link or on a page with 30 other outgoing links
  2. Source sites may get penalized
  3. Sites may be down for weeks, because for the webmaster it might not be that important

Conclusion: Often worth the costs, yet you don’t have any competitive advantage – competitors can see in Majestic where you buy ads and buy there too. Sometimes you just cannot find relevant links and are forced to buy irrelevant ones – they have less value and may dilute the site’s topic.

3. Build high quality relevant links

Options:

  1. Own sites
  2. Own blogs

Pros:

  1. competitive advantage
  2. complete control
  3. cheaper than bought links in the long-term perspective
  4. additional ways to build links: link exchange
  5. additional relevant traffic

Cons:

  1. you need a proven way to make many high-quality, Panda-proof sites
  2. you need to support the sites: add content, buy hosting
  3. you need to make sure that nobody can connect your sites
  4. you need to find ways to get many links to these sites

Conclusion: If you don’t make your own sites yet, you should at least think about it. It’s very tempting – but you have to do it right.

What to choose?

Option                   | Price   | Speed      | Quality | Control / Agility | Risks
Buy cheap links          | low     | 1 month    | low     | low               | high
Buy relevant links       | high    | 1-3 months | high    | average           | average
Build own relevant links | average | 3-6 months | high    | high              | low

We recommend combining the 2nd and 3rd options:

  1. Stop buying low quality links immediately
  2. Start or continue buying high quality relevant links but choose partners carefully
  3. Make your own sites linking to your important sites to reduce risks; use them also for link exchange, reducing the budget for buying links

Creating hundreds of sites: how to get it wrong

Our first sites used automatically synonymized content. Links from our 20 relevant sites promoted our important site to #3 in the “online blackjack” SERP for 6 months, and the same happened with other casino keywords. Unfortunately, those days are gone. You need to create readable content and think about security, because Google’s algorithms become more sophisticated every year.

Using WordPress or other widespread CMS is a bad idea

That’s the first thing that comes to an SEO’s mind, and many SEO gurus will tell you how to use WordPress for SEO. Still, if you want to make more than 10 sites, don’t invest your time and money into it.

If Google can detect that most of the sites linking to you use the same CMS (like WordPress), that’s not a natural pattern, so it’s a good reason to penalize the site.

Here are some ideas on how Google could detect WordPress (a rough detection sketch follows the list):

  1. Inline text
    1. Powered by WordPress
    2. <meta name="generator" content="WordPress 3.8.3" />
    3. <!-- This site is optimized with the Yoast WordPress SEO plugin

      <!-- Performance optimized by W3 Total Cache

  2. Source files in the same directories
    1. Images, CSS, JS in /wp-content
    2. Links to /wp-includes
  3. Existing URLs
    1. /wp-admin (shows login page) and /wp-login.php
    2. /xmlrpc.php (shows “XML-RPC server accepts POST requests only.”)
  4. RSS Feed format
    1. <generator>http://wordpress.org/?v=3.8.3</generator>

      (that’s my favorite because everyone forgets about it; Google sees that all your linking sites have the same WordPress version and are all updated on the same day – not suspicious at all)
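
For illustration only, here is a minimal Python sketch of the kind of checks listed above – not Google’s actual algorithm, just a demonstration of how easily these fingerprints can be spotted. The requests dependency, the function name and the example URL are assumptions.

    # Hypothetical sketch: detect the WordPress fingerprints listed above.
    # Assumes the requests library; the URL is just an example.
    import re
    import requests

    def wordpress_fingerprints(base_url):
        found = []
        html = requests.get(base_url, timeout=10).text

        # 1. Inline text: "Powered by WordPress", generator meta tag, plugin comments
        if re.search(r'<meta name="generator" content="WordPress', html, re.I):
            found.append("generator meta tag")
        if "Powered by WordPress" in html:
            found.append('"Powered by WordPress" text')
        if "Yoast WordPress SEO plugin" in html or "W3 Total Cache" in html:
            found.append("plugin HTML comments")

        # 2. Source files in typical WordPress directories
        if "/wp-content/" in html or "/wp-includes/" in html:
            found.append("/wp-content or /wp-includes paths")

        # 3. Existing URLs: the login page
        if requests.get(base_url + "/wp-login.php", timeout=10).status_code == 200:
            found.append("/wp-login.php exists")

        # 4. RSS feed generator element (often reveals the exact version)
        feed = requests.get(base_url + "/feed/", timeout=10).text
        version = re.search(r"wordpress\.org/\?v=([\d.]+)</generator>", feed)
        if version:
            found.append("RSS generator tag, WordPress " + version.group(1))

        return found

    print(wordpress_fingerprints("http://example.com"))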

Considering that 10% of all sites use WordPress, Google obviously has a WordPress-detection algorithm in its ranking formula and updates it regularly. If 50% of your links come from WordPress sites, you may be penalized soon.

The same goes for all other popular CMSs – Joomla, Drupal – and even frameworks like Symfony or CakePHP. The common rule is to use technologies that most webmasters use (PHP is preferable to Java), or ones that are used by less than 1%. Google is smart enough to detect widespread technologies. It will notice that you use PHP (as most sites do), so having all the sites linking to you built on PHP won’t be an issue. At the same time, WordPress is used by only 10% of webmasters, so you don’t want Google to recognize that all the sites linking to you are built on WordPress.

It’s better if your CMS is not open-sourced: in that case, it is much harder for Google and for other people to find connections between your sites.

Fingerprints in custom CMSs

The first thing you should remember: “NO FINGERPRINTS”. If there is something identical across all your sites, Googlebot will find it; if not, your competitors will find it and send it to the Google team. Here are some ideas on what you can do wrong:

  1. Tech stuff:
    1. Same IP or C-class network (11.22.33.44 and 11.22.33.45)
    2. Same NS servers (IrishWonder has an article on how Interflora got penalized)
    3. Same WHOIS
    4. Same domain registrar
  2. Nearby code
    1. Google Analytics (UA-1043770-1, UA-1043770-2, …)
    2. Google AdSense
  3. Same code
    1. Your own statistics code
    2. Your banner management system
    3. Same code in header/footer
  4. Paranoid
    1. Log in to Google services (Google Analytics, Google Webmaster Tools) from the same computer
    2. Visit many of your sites in Google-controlled software in one session: Google Chrome, Browser with Google Toolbar
    3. Find many of your sites in Google in one session: “site:example.com”

It’s hard to be too paranoid in this matter. Check everything so that nothing is shared (a comparison sketch follows the list):

  1. HTML code
  2. Scripts code (own and 3rd party like Google Analytics)
  3. Filenames
  4. URL structure
  5. Server headers
  6. Outgoing links format
  7. Same unusual robots.txt format
  8. etc.
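
As a rough illustration of such an audit, the sketch below compares a few of your own sites for shared giveaways such as Analytics account IDs, C-class IP ranges and identical Server headers. The function names, the requests dependency and the placeholder domains are assumptions, and the list of signals is far from complete.

    # Hypothetical sketch: flag fingerprints shared between your own sites.
    # Assumes the requests library; the domains below are placeholders.
    import re
    import socket
    from collections import defaultdict

    import requests

    def site_fingerprints(domain):
        resp = requests.get("http://" + domain, timeout=10)
        ip = socket.gethostbyname(domain)
        return {
            "c_class": ".".join(ip.split(".")[:3]),            # e.g. 11.22.33
            "server_header": resp.headers.get("Server", ""),   # identical stacks stand out
            "analytics_account": tuple(sorted(set(
                re.findall(r"UA-(\d+)-\d+", resp.text)))),     # same GA account across sites
        }

    def shared_fingerprints(domains):
        seen = defaultdict(list)
        for domain in domains:
            for key, value in site_fingerprints(domain).items():
                if value:
                    seen[(key, value)].append(domain)
        # anything shared by two or more of your sites is a fingerprint
        return {k: v for k, v in seen.items() if len(v) > 1}

    print(shared_fingerprints(["site-one.example", "site-two.example"]))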

Find a good hacker and ask them to point out what the given sites have in common. Give them several of your sites mixed with some of your competitors’ sites, and set them the task of figuring out which ones belong to the same owner.

Hundreds of sites: what we did

Content & Design

Content should be cheap yet unique and readable. Make sure that duplicate-content checks are either part of your business process or automatically integrated into your CMS.
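
A duplicate-content check does not have to be fancy. Below is a minimal sketch based on word shingles and Jaccard similarity; the 5-word shingle size and the 0.3 threshold are arbitrary assumptions, not figures from this article.

    # Hypothetical sketch: flag a new article as a near-duplicate of existing content
    # using 5-word shingles and Jaccard similarity. Thresholds are arbitrary.
    def shingles(text, size=5):
        words = text.lower().split()
        return {" ".join(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

    def similarity(text_a, text_b):
        a, b = shingles(text_a), shingles(text_b)
        if not a or not b:
            return 0.0
        return len(a & b) / len(a | b)

    def is_duplicate(new_text, existing_texts, threshold=0.3):
        return any(similarity(new_text, old) >= threshold for old in existing_texts)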

It’s a bad idea to grab a free or paid WordPress template and use it as is. Google knows it doesn’t take much effort to create such a site. It’s easy for Google to replace all content blocks with “Lorem Ipsum…” and compare screenshots. Consequently – yes, looks also matter, and synonymizing <div> classes is not enough.

Many CMSs make creating design templates overcomplicated. Make sure it takes no more than a day per site to create a template, and that each site is guaranteed a unique design.

Support

Things you should do:

  1. Track all domain and hostings information:
    1. When domains/hostings expire
    2. Which domain is at which registrar and hosting, and under which identity (WHOIS)
    3. What are the contact details, login/passwords, secret questions for each registrar and hosting
    4. What is IP (track if it’s changed; don’t buy hostings nearby)
  2. Check that your sites are live. 99% uptime means your site will be down about 3 days a year; if you have 100 sites, then on average some of your sites will be broken every day, and you need to fix them or move to another hosting as soon as possible (see the quick arithmetic after this list)
  3. Track and check all external links. If you have 10-50 sites, you can still use Excel. Otherwise, find a more automated solution.
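
To put those uptime figures in plain numbers, here is the quick arithmetic behind item 2, using the article’s own 99% uptime and 100-site assumptions:

    # Quick arithmetic behind the uptime claim (99% uptime, 100 sites).
    uptime = 0.99
    sites = 100

    days_down_per_site = (1 - uptime) * 365        # about 3.65 days per site per year
    sites_down_on_avg_day = (1 - uptime) * sites   # about 1 site down on an average day

    print("Each site is down about %.1f days per year" % days_down_per_site)
    print("Across %d sites, expect roughly %.0f site(s) down on any given day"
          % (sites, sites_down_on_avg_day))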

We keep only 1-2 sites on the same hosting. It’s up to you how many sites to allocate to a single hosting.

Also, don’t register all domains with the same registrar. It’s too suspicious if your site gets links only from GoDaddy-registered domains.

Link placement

For example, you have 100 sites linking to your 5 important sites. You have decided to publish 3 links from each homepage. In total you will have 300 links – that means 60 links for each important site.

You should post links not only to your sites but also to other trustworthy sites in your niche (even to your competitors) to make it look more natural. Let’s say you have decided to make 4 additional links from homepages and 7 additional links from inner pages to other sites. That comes up to 1300 links.

You can find relevant sites and ask them for a link exchange. This is how you get 1300 links back from other sites. That is about 5 times more than from your own sites alone, and it is less risky because it looks more natural.

Tip: Always make a noticeable “Contact Us” link from the homepage so that people who want to exchange links could contact you.

Get good software to track links, because:

  1. You want to link only to live sites (no broken links)
  2. If your link-exchange partner removes your link, you should know about it the same day

Usually two programs suffice: a CRM and a link checker, though it would be nice to have them integrated.
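
The link-checker part can start out very simple. Here is a minimal sketch (assuming the requests library; the URLs are placeholders) that verifies a partner page is still reachable and still contains your link:

    # Hypothetical sketch: verify that exchange partners still link to you.
    # Assumes the requests library; the URLs below are placeholders.
    import requests

    def check_partner_link(partner_page, my_domain):
        try:
            resp = requests.get(partner_page, timeout=10)
        except requests.RequestException:
            return "page unreachable"            # partner site is down
        if resp.status_code != 200:
            return "HTTP %d" % resp.status_code
        if my_domain not in resp.text:
            return "link removed"                # follow up the same day
        return "ok"

    partners = {
        "http://partner.example/resources.html": "mysite.example",
    }
    for page, domain in partners.items():
        print(page, "->", check_partner_link(page, domain))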

Budget

That really depends on your needs. We have several kinds of sites: simple ones (15 pages) costing around $300 per site, and more advanced ones (30 pages, better design and content) at around $600 per site.

Therefore, our budget for creating 100 sites is $30,000 to $70,000. As we’ve calculated, you can get 1600 links from those sites (300 links from the sites directly and 1300 via link exchange). That’s $20-40 per permanent link. Hopefully your sites will live at least 3-5 years, so you can split the expenses across several years – estimated expenses come to $4-13 per link per year. On the one hand, that is a much lower price than buying links from other sites ($150-500 per year); on the other hand, you can be completely sure of the quality.
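
For clarity, the per-link arithmetic can be reproduced from the article’s own figures (100 sites, a $30,000-$70,000 budget, 300 direct links plus 1300 exchange links):

    # Back-of-the-envelope reproduction of the article's own numbers.
    total_links = 300 + 1300                       # 1600 permanent links
    for budget in (30000, 70000):
        per_link = budget / total_links            # roughly $19-44 per link
        print("Budget $%d -> about $%.0f per permanent link, "
              "or $%.0f-$%.0f per link per year over 3-5 years"
              % (budget, per_link, per_link / 5, per_link / 3))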

Of course you should add:

  1. Support costs: domains, hostings, maintenance
  2. Linkbuilding price for these sites (cheap ways can be used here)

Automate everything

The catch is that you need the process and software to fit within the budget described. It may take from 6 months to a couple of years if you decide to develop it yourself. Still, a safe future is more important, isn’t it?

CMS features you may need:

  1. Backup. Hostings can go down and sometimes you lose all access. We always have the latest content backed up, and real-time data like contact-us forms, subscriptions, polls and visitor statistics are collected every 3 hours
  2. Easy migration. If some hosting becomes slow or not working at all, you might want to move your site to another hosting. This should be a very easy process. It should take minutes, not hours to transfer site from one hosting to another, and it should be simple to configure a site at a new hosting.
  3. Checking site availability. Pingdom will cost you a fortune if you have hundreds of sites to check. Still, if your sites are down, this may eat up a part of your budget that exceeds what Pingdom or similar services cost. We developed our own system because we needed additional information: which hosting is used now and which was used before, and how important the site is. We also needed to detect some errors that Pingdom considers acceptable (visible PHP code, missing </html>, etc.); a minimal sketch of this kind of check follows the list.
  4. Easy learning curve
    1. HTML developers. Use a templating system that allows you to copy another site’s design and slightly modify it. If they spend less than a day per site and your sites don’t share the same HTML, that’s enough for a working model.
    2. Copywriters. Make sure adding, modifying and uploading a page takes seconds, not minutes. The process should also be simple: your copywriter shouldn’t have to spend a month puzzling out your CMS.
  5. Automated error check. There is a lot of typical mistakes like unclosed tag. It’s not hard to check them automatically.
  6. Content history. If a copywriter accidentally removes something important, that should not be a problem.
  7. Automatic randomization. Even outgoing links to affiliate partners should have a different format on each site.
  8. Access control. Copywriters, HTML-developers and administrators should have different access levels
  9. Multi-user. If 2 copywriters try to edit the same page at the same time, the CMS should either prevent it or at least notify them.
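
As a minimal sketch of the availability check described in item 3 (assuming the requests library; the site list is a placeholder), something like this covers the HTTP status plus the “visible PHP code” and “missing </html>” cases:

    # Hypothetical availability check: HTTP status, visible PHP code, missing </html>.
    # Assumes the requests library; the site list below is a placeholder.
    import requests

    def check_site(url):
        problems = []
        try:
            resp = requests.get(url, timeout=15)
        except requests.RequestException as exc:
            return ["unreachable: %s" % exc]
        if resp.status_code != 200:
            problems.append("HTTP %d" % resp.status_code)
        body = resp.text
        if "<?php" in body:
            problems.append("visible PHP code")    # misconfigured hosting
        if "</html>" not in body.lower():
            problems.append("missing </html>")     # truncated page
        return problems

    for site in ["http://site-one.example", "http://site-two.example"]:
        issues = check_site(site)
        print(site, "OK" if not issues else ", ".join(issues))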

Conclusion

It’s tough to do natural link building in competitive niches, so you should at least make your link building look as natural as possible. It’s a good time to stop using low-quality links and to raise the bar even for relevant links. If you start making your own sites now, you will be prepared for the next Google updates and will have a competitive advantage on top.

There are a lot of issues with public CMSs, so you may need to develop a custom solution. Developing a CMS and several hundred sites may cost $300,000-$500,000. Still, it will pay off even in current conditions, and if Google continues to tighten the screws, it may be the only way to survive.

Featured image is used under Creative Commons License

Google Payday Loan Algo Punishes Spammy Search Terms, Except On YouTube

Recently I put together an article about press release sites taking a huge hit in search rankings, presumably due to the “payday loan” algorithm, which is supposed to target highly spammed keywords and sites using spammy techniques.

I spoke to an employee of a press release distribution company (both of whom will remain nameless), and they told me that the initial punishment occurred over the keyword “garcinia cambogia”, a keyword that gets more than 800,000 searches per month according to Semrush.

As I continued to write, I decided to do a search for that keyword and see what the new results were. To my surprise, I found a short YouTube video ranking near the bottom of page 1. After doing some research on the video, I examined its backlink profile and came to the conclusion that it was ranking purely on the strength of spam.

This discovery got me thinking that perhaps YouTube, a Google-owned property, might be “protected” from such actions. After all, the more traffic their videos receive, the more revenue they can generate through ads.

I decided to check out some other keywords to see if my theory held true in another niche. After some consideration, I decided to focus on a local SEO keyword, such as “city name seo”. I wanted a term that would have value and some decent search volume.

The keyword I settled on gets roughly 500 searches per month for its “city name seo” variation and could potentially generate a few hundred more visits by ranking for other variations of the same keyword.

Lo and behold, I was able to find a YouTube video ranking in the 6th position for this keyword.

[Screenshot: video rank]

Well, if it is ranking in the top 10 and Google is attacking spammy backlinks, then this must be a squeaky-clean, white-hat video, correct?

Think again!

The video has 65 views, yet it has 1700 backlinks from almost 300 domains. How does that happen? How can only 65 people have viewed the video, yet 1700 links have been created for it? Perhaps the links are quality, so let’s take a peek!

After checking the backlink profile in Majestic SEO, it turns out most of the links are coming via blog comments. Wait a minute, blog comments can be white hat, right?

[Screenshot: Majestic example]

Of course they can, but when the anchor text is either an exact match or some variation of the main keyword, it screams spam. Don’t take my word for it, take a look yourself!

[Screenshot: link example]

Notice that this page has been spammed to death and has some unsavory keywords on the same page as the “seo” keyword. I have marked out most of the information since I just want to point out the facts and do not want to “out” the video in question.

I think it is pretty clear that the site is simply using YouTube as a “host” to spam and rank.

In light of how Google has handled some news sites and the press release distribution sites, I find it rather interesting that they are punishing these domains in the name of “search quality”, yet their very own property can be used to rank for some of these keywords using the shadiest of tactics with no ill effects.

What are your thoughts?