If you keep up with Google’s webmaster features, you know that Fetch as Googlebot has been around for nearly two years. Webmasters can use it through Google Webmaster Tools. Tell Fetch as Googlebot to crawl a specific URL on a site you’ve verified, and you’ll see your page the way Google would see it. This is great for diagnosing and debugging site issues that don’t show up when you simply view the site in a web browser.
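If you just want a rough local approximation of that view, you can request a page while identifying yourself with Googlebot’s documented user-agent string. This is only a sketch: it shows what your server returns to that user-agent, not an actual Google crawl (Google fetches from its own IPs, so cloaking or IP-based logic won’t be revealed), and the user-agent string below is the one Google has published, which may change.

```python
import urllib.request

# Googlebot's documented user-agent string (an assumption: Google may revise it).
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def build_googlebot_request(url):
    """Build a request that identifies itself as Googlebot.

    Note: this only shows what your server serves to that user-agent;
    it is not the same as a real Fetch as Googlebot crawl.
    """
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

# Usage (performs an actual network fetch):
# with urllib.request.urlopen(build_googlebot_request("http://example.com/")) as resp:
#     html = resp.read()
```

Comparing that response to what your browser receives can surface user-agent-dependent bugs, though Fetch as Googlebot itself remains the authoritative check.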
Now, though, Fetch as Googlebot lets you take things one step further. If it successfully fetches your URL, you can request that Google index it by clicking a “Submit to index” link. You won’t want to do this with every page on your website, as there are important limits and points to consider, but it’s great to have this additional option.
One important point to keep in mind is that you can only submit 50 pages a week in this fashion. You can also ask Google to crawl all pages linked from the URL you’re submitting, but you can only do that 10 times in a month. Additionally, Google notes that submissions of images or video are more appropriately made using Sitemaps rather than Fetch as Googlebot.
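For images, that means listing them in a Sitemap using Google’s image extension. Here’s a minimal sketch of generating such a file; the page and image URLs are hypothetical placeholders, while the two namespace URLs are the standard Sitemap protocol and Google image-Sitemap namespaces.

```python
# Minimal image-Sitemap generator sketch. The example URLs below are
# hypothetical; substitute your own pages and images.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def image_sitemap(entries):
    """Render a Sitemap where each entry is (page_url, [image_urls])."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="%s" xmlns:image="%s">' % (SITEMAP_NS, IMAGE_NS)]
    for page_url, image_urls in entries:
        lines.append("  <url>")
        lines.append("    <loc>%s</loc>" % page_url)
        for img in image_urls:
            lines.append("    <image:image>")
            lines.append("      <image:loc>%s</image:loc>" % img)
            lines.append("    </image:image>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

print(image_sitemap([("http://www.example.com/event/",
                      ["http://www.example.com/event/banner.jpg"])]))
```

You’d then reference the resulting file from your robots.txt or submit it through Webmaster Tools, as with any other Sitemap.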
You should also keep in mind that submitting a URL in this way does not mean that Google will definitely put the page in its index. In its blog post on the topic, Google states that “we don’t guarantee that every URL submitted in this way will be indexed; we’ll still use our regular processes – the same ones we use on URLs discovered in any other way – to evaluate whether a URL belongs in our index.”
On the other hand, using Fetch as Googlebot does speed up the crawling of your URLs; Google will do that within a day of your submission. It’s worth remembering that not everything Google crawls goes into its index, and not every URL Google learns about gets crawled right away in any case. As Vanessa Fox explained on Search Engine Land, Google arranges the list of URLs it discovers in priority order before crawling them. Google uses a number of factors to determine a page’s priority, including the page’s overall value, PageRank, frequency of change and how important Google thinks the page is (this is why news home pages tend to get crawled and indexed very quickly).
If you want to take advantage of this new way to submit URLs from your website to Google for crawling and possible inclusion in its index, it’s best to use it for new areas and categories on your site. If you’re making a major update, Fetch as Googlebot is also a good way to get Google’s attention quickly, and it can help speed up URL removal or cache updates as well.
You can also use Fetch as Googlebot, to a limited degree, as part of a campaign to promote your website and raise awareness of your business. Google notes that “You can also submit URLs that are already indexed in order to refresh them, say if you’ve updated some key content for the event you’re hosting this weekend and want to make sure we see it in time.” You can use this to keep visitors happy, or at least to keep them from getting frustrated by stale information.
For example, I’ve attended a number of weekend-long events that involved a great deal of planning, along with an associated website (or at least a section of one). These events often included several tracks of programming, but with so much going on involving so many people (often volunteers), the actual schedule didn’t gel until a week or less before the event. Once the schedule has been turned into HTML pages and/or PDF documents, using Fetch as Googlebot to make Google aware of those pages can help…especially if you have a few attractive last-minute additions, like a speaker with a large fan following.
How you choose to use Fetch as Googlebot is up to you; for most purposes, you should probably continue to use Sitemaps. But Fetch as Googlebot sounds like just the thing to use if and when you’re getting ready to expand or extensively revise your website, and you want Google to know about the process as quickly as possible. Good luck!