Is 2016 The Year Bing Takes On Google?

Bing Gains Ground As Google Loses Browser Contracts

Bing has reached a record-breaking 21% share of the search engine market. Find out what this means in relation to the current market leader, Google.

Imagine a world where we ‘Bing it’ to get the answer to a burning question rather than ‘Googling it’. 2016 could well be the year that sees this become a reality.

The Rise Of Bing

Bing, the search engine owned by Microsoft, is rapidly gaining ground on Google, which has been the dominant player in search for more than a decade.

Bing hasn’t always been a success story. In recent history, the Microsoft search engine was costing the company roughly $1 billion per quarter. There were calls by many to ditch the search engine market altogether. However, Microsoft’s CEO Satya Nadella saw something in Bing that others did not and committed to investing in the service. His instinct has paid off and Bing is now making a profit of approximately $1 billion per quarter.

It is likely that the increase in profitability has come from the mass release and rollout of Windows 10 and Surface devices, both of which use Bing as their default search engine. Similarly, the Windows 10 Mobile and Windows Phone operating systems ship with search results powered by Bing.

The Proof Is In The Stats

The search engine market share statistics speak for themselves. Bing has now reached a 21% share of the search engine market for the first time, whilst Google hangs on to a massive 64%. Google is of course still comfortably in the lead, but the speed at which Bing is tightening its grip on the market is likely to make Google uneasy, given that search results are its core business.

An honourable mention should also be given to Yahoo, which is hanging on to a market share of approximately 12%.

Companies Ditch Google

In January of this year, an enormous ten-year contract between AOL and Microsoft came into effect, which sees AOL introduce Bing as its default search engine. AOL, which is now owned by Verizon, had previously been using Google for its search results. Google suffered a comparable loss last year when Mozilla chose Yahoo as the default engine in its Firefox browser. This was huge news at the time, as it represented Google's largest loss in search engine market share since 2009.

Safari Traffic

In addition, there are rumours that Apple, whose Safari browser currently uses Google on iPhone devices, might also be considering a switch to Yahoo, Bing or even an in-house search engine. This would be a significant blow to Google, as reportedly more than half of mobile traffic in the USA comes from Safari, according to data collected in December 2014.

If Google continues on this downward trend while Bing is on the up, then it's only a matter of time before the two meet in the middle. That said, when it comes to Google, it's safe to say that they like a challenge and aren't scared to shake things up. The same question was asked last year, though, and the change was not as dramatic as expected. Watch this space to find out what developments the current search engine market share leader will come up with to hang onto the crown in 2016.

Google’s Quality Rating Guidelines: What Stood Out for You?

A few weeks ago, Google published their official Search Result Quality Rating Guidelines, instructing their human raters on how to evaluate Google SERPs.

The guidelines provide lots of insight into how Google defines quality and what they intend their algorithm to understand.

I asked fellow marketers and bloggers to provide their main take-away from the guidelines and here are the answers:

Quality is Equivalent to the Average User's Judgment of Quality

Phil Turner

My main takeaway is that Google is looking for pages that help searchers, exactly as it has always said. Quality is basically equivalent to the average user's judgment of quality.

Yes, that is still vague, but we all know a low quality site when we see one. Similarly we all know a high quality one. We might differ in the details, but if we are talking about general perception I think most people will agree.

Keep Your Content Fresh

David Trounce

Google's Quality Rating Guidelines are a reminder for small businesses, especially in e-commerce, to keep their content fresh. The Guidelines give special attention to freshness as a factor in their "High Needs Met" (HNM) ratings.

Page 141 of the report tells us,

"For these queries, pages about past events, old product models and prices, outdated information, etc. are not helpful. They should be considered “stale” and given low Needs Met ratings. In some cases, stale results are useless and should be rated FailsM."

If you are providing product information, make sure it is well maintained with current data. This should include a review of on-page SEO factors such as buzz keywords and relevant trends. You can also add value and improve your score in this area by publishing fresh content around product updates and new releases via a well-maintained blog on your site.

For E-A-T (Expertise, Authority, Trustworthiness) websites, which might include technology blogs or tutorial sites, for example, the freshness scale is less important, since a fair amount of content in this field does not change (think software tutorials or a first aid procedure). But businesses should still take advantage of the freshness factor and aim for a High Needs Met rating by updating, improving and adding value to existing, static content from time to time.

YMYL (Your Money or Your Life) Sites Are Held to Higher Standards

Tom Shivers

According to the guidelines, Evaluators hold YMYL sites to higher standards.

YMYL is short for "Your Money or Your Life". These sites include medical, financial, health and safety sites, sites that require personal identification, sites that provide advice on major life issues, and even ecommerce sites that sell expensive products or products with major life implications:

  • Online banking pages
  • Pages that provide info on investments, taxes, retirement planning, home purchase, buying insurance, etc.
  • Pages that provide info on health, drugs, diseases, mental health, nutrition, etc.
  • Pages that provide legal advice on divorce, child custody, creating a will, becoming a citizen, etc.
  • There are many other web pages that fall under YMYL at the discretion of the Evaluator.

When an Evaluator identifies a YMYL site, they will research its reputation:

  • Does the site have a satisfying amount of high quality main content?
  • Does the site have a high level of authoritativeness, expertise or trustworthiness?
  • Does the site have a positive reputation?
  • Does the site have helpful supplementary content?
  • Does the site have a functional page design?
  • Is the site a well-cared for and maintained website?

YMYL sites must convince Google Evaluators that they possess a healthy level of credibility and online reputation.

Google Strives to Identify "Main Content" on a Web Page

Casey Markee

I thought one of the big takeaways for me was Google's emphasis on "main content." Google was clear in instructing raters that they should be on the lookout for, and are actively encouraged to downgrade, pages that make it hard to distinguish main content from ads or other distractions on the page.

To me this is all about user experience and Google's continual desire to make sure their index gives preference to site pages that have a clear separation between advertising and content. Quality raters are encouraged to give a less than helpful rating to pages where the line between the two is blurred. And that, to me, provides a great benefit to users.

Google Does Rely on Humans for Algorithm Evaluation

David Waterman (SEO Rock Star)

Having worked in the SEO industry for over 10 years, the release of the latest Google Quality Rating Guidelines is yet another reminder that Google doesn't rely 100% on bots and algorithms to determine quality online content.

It layers on a human component to ensure the results Google provides are of high quality and match the true intent of the search query.

Make Your Site Mobile-Friendly

Graeme Watt

The biggest takeaway for me was to make your site mobile friendly if it isn't already. A large proportion of the guidelines is focused on mobile, and it is clear Google now views mobile-friendliness as a sign of a quality website.

If this is the case, it means that anyone producing amazing content on a site which is not mobile friendly is going to be viewed as low quality. This should be avoided at all costs. 

Google Wants to "Think" Like Human Beings Do

Louie Luc

"Quality" and "relevancy".
It just couldn't be simpler than that.

That's what users are searching for when they use a search engine like Google. That's what Google wants to offer its users.

Google aims at thinking more and more like a human being so that it may "understand", "feel" and "see" what a user understands, feels and sees when he / she visits a website suggested by a Google search.

And what are people looking for? Quality relevant sites or web pages.

Put Your Users First

Doyan Wilfred

Put your users first and foremost.

  1. Write high-quality, in-depth, well-researched articles.
  2. Write for users. Optimize for search engines.
  3. Provide helpful navigation (think breadcrumbs).
  4. Invest in clutter-free, user-friendly, mobile-friendly design.
  5. Display your address and contact information clearly.
  6. Create and maintain a positive reputation. Content won't save you if you send hitmen after your customers (true story!).
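The breadcrumb navigation mentioned in item 3 above can also be surfaced to search engines via schema.org structured data. Here is a minimal sketch in Python (the page names and URLs are hypothetical) that builds the `BreadcrumbList` JSON-LD a page would embed in a `<script type="application/ld+json">` tag:

```python
import json

def breadcrumb_jsonld(crumbs):
    """Build a schema.org BreadcrumbList from (name, url) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {
                "@type": "ListItem",
                "position": i,   # positions are 1-based per schema.org
                "name": name,
                "item": url,
            }
            for i, (name, url) in enumerate(crumbs, start=1)
        ],
    }

# Hypothetical trail: Home > Blog > SEO
markup = breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Blog", "https://example.com/blog/"),
    ("SEO", "https://example.com/blog/seo/"),
])
print(json.dumps(markup, indent=2))
```

The same trail should of course appear as visible links on the page; the JSON-LD simply mirrors it in a machine-readable form.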

Expert Content Will Be Rewarded Irrespective of Domain Authority

Cormac Reynolds

From what I can gather, one of the main takeaways is that we're coming increasingly closer to a point where quality, expert content will be rewarded irrespective of the domain authority of a website.

It seems the algorithm is becoming increasingly intelligent and capable of determining the best content, so those who put the effort into sharing details and information will be rewarded. Admittedly, we're probably still a while away from this as an absolute, but from the look of the guidelines, things are going that way.

The Fundamental Principles Are The Same: Provide Quality, Put the User First…

Tim Felmingham

There's really nothing new here, it's very similar to the guidelines leaked (supposedly unofficially!) in 2008, and a few times since. The overall message is the same as it always was – you need to build sites with original, quality content that provides real value to the searcher.

They have defined a quantitative process for assessing this, including Expertise, Authority, and Trustworthiness, and how well a page meets the searcher's needs. The process is interesting but not revolutionary; it's simply a formal definition of what we all understood anyway.

Many people will flock to this document, in the hope it will give some insights into how to 'game' the system, which of course it won't! Although the general principles of the guidelines will be familiar to anybody involved in SEO, it's still well worth a read, just to make sure there aren't any key areas you have missed in your own site. It will show you how to view your site through the eyes of a Google rater, and more importantly, through the eyes of a user.

The Emphasis is on the Quality

Nashaat

It's clear that Google prefers human-written content over machine-generated content when evaluating quality. They also place more emphasis on relevant indicators such as time spent on the website and customer reviews. Again, the emphasis here is on the content of the reviews and not the number of reviews.

And what’s your main take-away?

How to Use Social Media to Boost SEO

In his Entrepreneur post “Good Social Media Boosts SEO,” Larry Alton admits that we do not know how social media affects SEO because we don’t know how much emphasis Google places on social interactions and signals. Alton writes,

“Here’s what we do know: Google pays attention to social interactions such as likes, retweets, shares, and even +1’s.”


I like his analogy that Google is an "independent filing system" for a continually "expanding library of virtual content" online. He argues that Twitter gets our content indexed more quickly and that social search measures performance.

More importantly, all social networks are indirect link-building mechanisms. The more your content is shared across social media, the more likely a content creator will find it and decide to link to it.

Targeting the Right Audience

Just getting content shared is not enough; the point is to get content to your target audience. Targeting the right audience is just as important in social media as it is in SEO. Start with this tip from Capital Merchant’s “Tips to Improve Your Social Media”:

The most important thing is to define your audience. Think of your ideal customers and write down any and all information you can think of about them. Be specific with regard to gender, race, age, and profession. The more info you can think of, the better. Now expand it a bit — you want to have three to four different target audiences that will get your messaging at different times, letting you cast a wider net and still be able to figure out what messaging resonates with what customer.


Nothing is more important than reaching potential customers and clients, rather than wasting marketing money on people who aren’t interested in what you have to offer. It is not just a matter of getting impressions. The point is to get interactions: favorites, retweets, reshares, likes, +1s, stumbles, and especially clicks.

Those interactions are likely to also be important to SEO value. Mentions on social media improve brand recognition, increasing clicks during searches. The tighter your focus on the correct audience, the stronger your results will be.

SEO Specific Social Media Strategies

In his Forbes post “6 Social Media Practices That Boost SEO,” Jayson DeMers argues there are six social media strategies that definitely improve SEO. His first strategy relates to growing the number and quality of your followers.

Our social media profiles have their own trust flow and page authority, which is affected by the quality of sites linking to them. This is easily seen by comparing Twitter accounts for writers who contribute on major sites to non-writers.

Look at @Kikolani, @GrowMap or @SEOsmarty and compare them to the average Twitter account. This holds true on all social networks. They are authority sites, and although the links are typically nofollowed, they are still considered valuable as indicators of trust.

According to “How Valuable Is That Link?,” a post gleaned from a forum discussion with SEOchat member PhilipSEO:

“Some observers think that a nofollow link from a trusted site still passes along some kind of SEO value, even if it doesn’t pass any link juice. Phil states that ‘Many have reported and speculated, for example, that nofollow links from Wikipedia and similar high-trust sites can provide a great boost to rankings in spite of nofollow.’”


It seems safe to assume that social media accounts can develop higher trust rankings dependent upon where the links originate. This could include which other authority sites mention them or reshare their content. If those links are dofollow, the Page Authority (PA) of those social media profiles will be higher.


Does it then follow that shares from higher trust and PA social media accounts are more valuable for SEO purposes than shares from ordinary accounts? I would say, yes. It is possible to use social media to boost SEO.

SEOGenius Director Bruce Smeaton drills down even further:

“I think the key is in understanding what ‘more valuable for SEO purposes’ really means.” It stands to reason that a highly trusted social channel is, by default, going to generate more shares than a social media property with lower authority. And by doing so, the likelihood of these shares turning into links increases accordingly. This in turn results in higher traffic volumes being directed to your site… and this is where the magic starts. No, it doesn’t matter whether the links are “follow” or “nofollow.” What really counts here is that the traffic generated indirectly via social media interaction must impact positively on your site’s perceived value and authority. To quote the words of Search Engine Journal’s Dario Zadro, “What is more likely happening is that Google is recognizing these social signals as ‘brand signals,’ which they love.”


If higher trust and PA matter, it is highly likely that relevance does as well. There are tools like Klout and Kred that measure influence. Kred is of particular interest because it measures influence by topic.

No doubt Google's algorithms can measure what is most relevant to any particular social media account. The users who are most influential on a particular topic are also the most likely to be valuable for SEO purposes.

SEOs need to be thinking about how identifying and using influencers will impact their rankings. Those who do not will be left wondering why competitors are outranking them.