To be sure we’re all on the same page, you can visit our SEO Chat review of Google Sitemaps here. It’s mostly a recap for seasoned webmasters who use the service constantly, but it’s worth checking out for anyone trying to decide whether the service is worth their time.
For those who want the quick and dirty news, here it is. Jill Lindenbaum from the Google Sitemaps crew sent SEO Chat the following list of improvements to the service. This is the basic rundown, and we’ll start talking about the improvements in a bit more depth on the following pages.
- Updated interface based on user feedback
- New verification method
- Indexing snapshot, including:
  - If the site is in the index
  - When the Googlebot last accessed the home page
  - If some pages of the site are only partially indexed
  - If the home page is currently inaccessible
  - If Googlebot has encountered a large number of errors when trying to crawl the site
- Notification of violations of the webmaster guidelines
- Re-inclusion request form
- Spam report
- New webmaster help center
This is obviously more than just a minor makeover. Some of these things are new and some were available elsewhere before, but it seems Google is working to make Sitemaps a webmaster hub. They are pulling all the resources you need to manage your Google presence under the Sitemaps umbrella.
So keep reading, and take a quick look at these new tools and the new layout.
You will probably notice differences in Google Sitemaps as soon as you log in. Instead of separate tabs for viewing your current sites and adding new ones, Google has put everything on the front page. It’s faster to use, and somehow Google made a page with more on it feel cleaner and less cluttered.
All the links that used to occupy the top space have moved down to the bottom or to more relevant places. The “Tools” link seems like the only unnecessary thing here; it just jumps to the four links at the bottom, and it doesn’t seem to work in Opera.
Those links at the bottom, you may notice, have some additions too. “Download this table” is nothing new, as it is the same as the old “Download as .csv file.” The next link, as the name would suggest, offers more options for downloading data.
Google has also been so kind as to give fast and convenient locations for the links to their spam reporting tool and re-inclusion request form. These tools were around, but you had to find them separately before. Pulling them into the webmaster tools offered by Sitemaps was a great idea.
The spam report is pretty straightforward: just supply the query, the URL, and the offenses. The list of offenses shows how routinely Google works on getting rid of specific problems, and the company says it will investigate every spam claim. Google is definitely on the right track with this one.
The re-inclusion request lets you select the removed site from a dropdown menu of your sitemaps. Then you check off the three boxes to say (1) I know I was wrong, (2) I will mend my ways, and (3) I know what you want me to do now. Beneath all this you can write Google an apology letter.
When you open the details for one of your sites, you’ll see a huge layout change. The whole tab and table setup has been renovated to be more navigable.
So what is all this new stuff here? Most of it is pretty familiar. Let’s take a look.
The new layout shuffles things around a bit. There is a Diagnostic tab up front containing basic stats, significant errors, and the handy robots.txt validator. The Statistics and Sitemaps tabs are basically the same, but the navigation now uses a sidebar instead of two rows of headers.
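For readers who haven’t used the validator, it checks the robots.txt file your site already serves. As a reminder of the format it parses, here is a minimal, hypothetical robots.txt (the paths and rules below are purely illustrative, not SEO Chat’s actual file):

```text
# Rules for all crawlers: keep them out of two illustrative directories
User-agent: *
Disallow: /admin/
Disallow: /search-results/

# Example of a Googlebot-specific rule
User-agent: Googlebot
Disallow: /print-versions/
```

The validator is handy precisely because small mistakes in a file like this can block Googlebot from pages you meant to have indexed.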
The Indexing summary tells us whether the site is in the Google index and when the last crawl was. Whether the site is in the index is easy to find out on your own (query site:www.seochat.com/), but this makes it immediately obvious. It’s nice to know when Googlebot last stopped by to update the index, but I would be equally interested in when it is coming back. An estimated time until the next crawl would give webmasters a sign of their significance to the engine’s algorithm.
Potential indexing problems tells us that SEO Chat is only partially indexed. I was at first a little confused as to what this meant. I clicked on the little question mark to the right, and Google told me:
The Google index contains two types of pages: fully indexed and partially indexed pages. A site that’s listed by its URL and appears without a cached copy and a detailed title is partially indexed. When a site is partially indexed, it’s because our robots were unable to completely review its content during a recent crawl.
It goes on to say that the site can be optimized to crawl more easily, but it’s really a limitation of Googlebot at the moment. SEO Chat could trim some titles, HTML, and scripts, but it’s probably not worth the effort, since the site is already pretty thoroughly indexed.
Other things can turn up in the indexing problems, such as if your home page is currently inaccessible.
You can also see right on this first page that Google has encountered at least 10 HTTP errors, thanks to the error notifier. Considering that SEO Chat recently changed its URL-generating script, this is lower than I expected.
Finally, Google gives us a few links to the new help center. This section is improved since last time I looked at Google Sitemaps.
As you may have already noticed in the few screens I showed, there is a link for everything. Practically every piece of indexing information, error message, and statistic has a link with it, and often that link sends you to a relevant page in the Webmaster Help Center. Staying ahead of questions like this will make Sitemaps better for less savvy site owners and SEOs.
Besides all the handy topical links, Sitemaps also has a whole help directory that you can find here: http://www.google.com/support/webmasters.
The help center is now more detailed. It has not just the help you might need in Google Sitemaps, but also answers to general webmaster questions, third party scripts that go with Sitemaps (such as a plug-in for vBulletin among many other things), and information on Googlebot.
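The help center’s coverage of third-party sitemap scripts is a good reminder that a basic sitemap file is simple enough to generate yourself. Here is a rough sketch in Python; the URLs and priorities are placeholders, and it targets the plain XML sitemap format rather than any particular plug-in:

```python
# Minimal sitemap generator: turns a list of URLs into sitemap XML.
# The example URLs and priority values are placeholders for illustration.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap XML string for an iterable of (loc, priority) pairs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, priority in urls:
        lines.append('  <url>')
        lines.append('    <loc>%s</loc>' % escape(loc))       # escape &, <, >
        lines.append('    <priority>%.1f</priority>' % priority)
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

if __name__ == '__main__':
    pages = [('http://www.example.com/', 1.0),
             ('http://www.example.com/articles/', 0.8)]
    print(build_sitemap(pages))
```

You would write the output to a file such as sitemap.xml in your site’s root and submit its URL through the Sitemaps interface.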
Along with the new support, they also provide a new way to verify sites. You can still place the randomly generated HTML file in your site’s root directory as before, but now they also permit using a special meta tag for verification. You can find more information here.
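For reference, the verification tag goes in the head of your home page. The tag below is only a mock-up: the token is made up, and the exact attribute name is an assumption on my part, so copy the tag Sitemaps generates for your site verbatim rather than this one:

```html
<head>
  <!-- Hypothetical verification tag; Sitemaps supplies the real name/content pair -->
  <meta name="verify-v1" content="EXAMPLE-TOKEN-FROM-GOOGLE" />
  <title>Example home page</title>
</head>
```

The meta tag option is handy if you can edit templates but can’t upload arbitrary files to the root directory.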
I definitely like the new layout and better help support. In an internet that is increasingly dependent upon Google rank, it makes a webmaster’s job a little easier. Still, there are a few things that would be nice to see down the line for Sitemaps.
With the indexing stats, it would be very nice to see a site’s crawling frequency. Watching the date Google last crawled the site can give us a clue over time, but I’d much rather see an estimated time for the next Googlebot crawl and a crawl priority. That would give a better idea of how much fresh content Google is identifying, and webmasters could watch Google’s crawling preferences change.
More query stats would also be a welcome gift from Google. As I mentioned a month ago, I can see what terms people use to find SEO Chat, but I can’t tell which pages are the most successful. A way to see searchers’ clickthrough rates for the top queries a site appears in would have search marketers singing Google’s praises. That would make all our jobs easier; it would encourage people to build listings that are helpful rather than merely high-ranking. It would also let us compare clickthrough rates for organic listings and PPC ones. Of course, that very organic-versus-PPC comparison may be a reason why Google would not be excited about this feature.
Overall, though, Google has made their service a helpful tool for any site manager. Like I said at the end of the last article on Sitemaps, if you don’t already use it, you should.