SEO FAQ Answered by Google Webmaster Tools

If you own a website, having an account with Google Webmaster Tools is invaluable. It can provide you with a wide variety of information about your site, its traffic, and how Google perceives it. In particular, webmasters just starting to learn SEO have many questions, and Google Webmaster Tools has the answers.

Webmasters just starting to learn search engine optimization ask a lot of the same questions in SEO forums, and many of those questions can be answered accurately and precisely by Google Webmaster Tools.

Most often, it is webmasters or bloggers who do not have a Google Webmaster Tools account for their own website or blog who end up asking these questions, or who become confused when diagnosing technical website and SEO issues. This free tool from Google provides more information than you might expect, and it can answer both basic and advanced frequently asked questions (FAQ) in SEO.

This article covers some advanced and new Google Webmaster Tools features that were not discussed in a previous article about the five most important things you can do with Google Webmaster Tools. If you are ready, let’s get started.

Google has not indexed my website. Why?

Log in to the Google Webmaster Tools account for your website, then confirm the following items:

Step 1: Look at the Dashboard front page. Do you see the message "Googlebot has successfully accessed your home page"? If yes, then your home page is indexable; proceed to the next step for further troubleshooting. See the sample screenshot below:

Step 2: Go to Dashboard -> Diagnostics -> Crawl Errors, then click the "Web" tab. Under "Show URLs:" click "Restricted by robots.txt." Does the list include pages on your website that should be crawled and indexed? If yes, then you may have accidentally blocked those URLs in your robots.txt file (a sample rule that causes this appears after this step). You can also check "Unreachable," "Not followed," and "Not found" for details. Some pages are not crawled because every link pointing to them carries the rel="nofollow" attribute.
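For reference, a single overly broad rule in robots.txt is a common cause of accidental blocking. The fragment below is only an illustrative sketch with hypothetical paths; the first Disallow rule blocks an entire directory, while the second blocks just one file:

    User-agent: *
    # Blocks every URL under /products/, including pages you want indexed
    Disallow: /products/
    # Blocks only the single printer-friendly script
    Disallow: /cgi-bin/print.cgi

If URLs you want indexed appear under "Restricted by robots.txt," compare them against rules like these in your own robots.txt file.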

Step 3: Did you submit an XML sitemap? If yes, go to Site configuration -> Sitemaps and find the column labeled "URLs Submitted." If the number of "Indexed URLs" is substantially lower than "URLs Submitted," then you may have problems with your internal navigation menu and site crawlability. Use the data provided in Step 2 to help you troubleshoot crawl errors further.
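If you have not yet submitted a sitemap, a minimal XML sitemap looks roughly like the sketch below; example.com, the paths, and the dates are all placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2010-03-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/category/widgets.html</loc>
        <lastmod>2010-02-20</lastmod>
      </url>
    </urlset>

Once it is submitted under Site configuration -> Sitemaps, the "URLs Submitted" count should match the number of <url> entries in the file.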

Step 4: Some websites have inconsistent navigation menus; for example, the home page navigation menu has 30 links pointing to category pages, but the category pages have very few links pointing to deeper product pages. This can reduce the number of pages indexed on your website. Remember that links are one of the ways Googlebot finds new pages on your site.

To diagnose this, go to Dashboard -> Your site on the web -> Internal links, and you will see a list of URLs on your website alongside the number of internal links pointing to each one. If your navigation menu is fairly consistent, the number of links will be roughly the same from page to page. If some pages receive no internal links, or are not listed at all (because no internal link points to them), click through and examine the actual pages to see whether rel="nofollow" has been applied to the links leading to them; as noted earlier, that can affect crawling. Also check for pages carrying a meta noindex tag; a sample of both is shown below.
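When you examine an individual page, these are the two pieces of markup to look for. The snippet below is a made-up example showing a meta noindex tag (which keeps the page out of the index) and a nofollowed internal link (which tells Googlebot not to follow that link):

    <head>
      <!-- tells search engines not to index this page -->
      <meta name="robots" content="noindex">
    </head>
    ...
    <!-- tells Googlebot not to follow this internal link -->
    <a href="/products/blue-widget.html" rel="nofollow">Blue widget</a>

Removing either of these, when they are unintentional, lets Googlebot crawl and index the affected pages again.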

Is my content relevant to my target keywords and niche?

Go to Dashboard -> Your site on the web -> Keywords. You will see a list of words that reflect the subject matter of your website. For example, I have a website focused on PHP development, so I expect the main theme to be PHP programming. Checking the list of keywords, this is what I see:

It shows that the word "php" is the most significant term Google found on my website, so my content is directly relevant to my targeted terms and niche. However, this is not yet conclusive, because you still need to examine the anchor text of the backlinks pointing to your website.

Go to Dashboard -> Your site on the web -> Links to your site, and then click the “Anchor text” tab. It should show anchor text that is somewhat related to the content of your website. For example, below is a sample result:

It shows that my anchor text is consistent with my website content. So by that checkpoint alone, you can say the website performs pretty well on relevance. But more goes into your ranking on Google than your site's relevance; a lot of other factors come into play. How trustworthy and authoritative the domain is, the age of the domain, the site's overall link popularity: all of these play a role, not just relevance based on content and links. This makes ranking in Google pretty challenging for newly established websites targeting competitive niches, because it takes months to years of hard work to earn a ranking reputation in Google search.

For an analysis of “Top search queries,” which is also an important assessment of your content, refer to this article.

To be specific, this will answer the question, "Why am I not getting clicks from Google despite my good ranking?" All you need to do is compare two columns, Impressions and Clickthrough. If the term with the most impressions is not also number one under Clickthrough, the snippet Google chose from your content may not be that attractive to search engine users.

This is where relevant and substantial content is important. You need it for Google to associate highly related snippets with your site, which can improve clickthroughs. You can try searching Google yourself to see what snippet it shows.

Try to judge whether the snippet is good enough. If not, there are a number of techniques you can apply, which you can learn from Google; one simple example appears below.
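One of the simplest of those techniques is writing a clear, descriptive meta description, since Google often uses it as the snippet when it matches the searcher's query. The example below is only a sketch with placeholder text for a page on a PHP-focused site:

    <head>
      <title>PHP Session Handling Tutorial</title>
      <meta name="description" content="A step-by-step PHP tutorial on starting, reading, and destroying sessions, with working code samples you can copy.">
    </head>

Keep the description short (roughly a sentence or two) and specific to the page, so the snippet searchers see matches what the page actually delivers.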

Does my website contain malware from a hack?

Google has added a new section to Webmaster Tools called "Labs." In that section, click "Malware details" and you will see whether Googlebot found malware on your website.

Does my website load quickly compared to other websites?

Again, look at "Site performance," which is also under "Labs." Under the performance review, you will see how fast, in seconds (on average), your pages load. Google then compares that figure with the rest of the websites it measures, expressed as a percentage.

Ideally, you should make your website load as fast as possible, to avoid losing visitors to slow page loads; slow loading can hurt both sales and traffic. One common speed improvement is shown below.
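As an example, one common first step is compressing text responses before they are sent to the browser. The .htaccess fragment below is only a sketch, and it assumes your site runs on an Apache server with mod_deflate available:

    # Compress HTML, CSS, and JavaScript responses (Apache mod_deflate)
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript
    </IfModule>

After a change like this, watch the "Site performance" chart over the following weeks to see whether your average load time drops.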

Am I cloaking or do I have hidden text in the form of hacked content?

Cloaking based on user agent can be difficult to check (for example, if your site is hacked and the hackers configure it to show the hacked content ONLY to Googlebot while showing the normal content to visitors, including you). To determine whether your website has this form of hidden text/cloaking problem, you can use "Fetch as Googlebot," a new feature added under Dashboard -> Labs.

Take the following steps:

Step 1: Type in the URL you need to analyze, or leave it blank if it is the home page you are analyzing.

Step 2: Click the “Fetch” button.

Step 3: It may take a few minutes; reload the page, and if you see the "success" link, click it. What you will see is the HTML source code exactly as fetched by the Googlebot user agent (not by the usual browser your visitors use to view your content).

Copy and paste the source code, from <!DOCTYPE html PUBLIC… through </html>, into a plain text file and save it as test.htm.

Open it in a browser. It should look exactly like the page intended for public viewing. If you see spam (keyword stuffing or a large number of spammy links), or anything that is shown only to search engine bots and not to your usual visitors, then you have been hacked. You can read more about this from Google.
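As a rough idea of what to look for, hacked cloaking often takes the form of keyword-stuffed text or link blocks hidden from human visitors. The fragment below is a made-up example of the kind of markup that would show up in the Googlebot-fetched source but not in what your visitors see in their browsers:

    <!-- hidden block injected by a hacker; visible only in the fetched source -->
    <div style="display:none">
      cheap pills buy cheap pills online cheap pills
      <a href="http://spam-site.example/">spam link</a>
      <a href="http://another-spam.example/">another spam link</a>
    </div>

If anything like this appears in the Fetch as Googlebot output but not in your normal page source, your site has most likely been compromised and needs to be cleaned up.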
