Matt Cutts: Let Googlebot Crawl JavaScript, CSS

Last week, Google webspam guru Matt Cutts posted a short video that he described as a “public service announcement” to webmasters. In it, he asked them to stop blocking Googlebot from crawling JavaScript or CSS.

Cutts normally uses these short videos to answer questions he’s received. In this one, he acknowledged that many webmasters block Googlebot from crawling JavaScript or CSS by using robots.txt. He tried to allay concerns about Googlebot causing problems when crawling a site. “A lot of people block it because they think, ‘oh, this is going to be resources that I don’t want to have, you know, the bandwidth [used up] or something,’” he admitted, then went on to say “but Googlebot is pretty smart about not crawling stuff too fast.”
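To see what this kind of block actually does, here is a minimal sketch using Python’s standard-library robots.txt parser. The site layout and directory names (`/js/`, `/css/`) are hypothetical, but the `Disallow` rules mirror the pattern Cutts describes: any compliant crawler, Googlebot included, must skip those resources.

```python
# Sketch of a hypothetical robots.txt that blocks script and stylesheet
# directories, and how a crawler honoring it would behave.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /js/
Disallow: /css/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot (or any compliant crawler) must skip the blocked resources:
print(parser.can_fetch("Googlebot", "https://example.com/js/app.js"))    # False
print(parser.can_fetch("Googlebot", "https://example.com/css/site.css")) # False

# The HTML itself is still crawlable -- which is exactly the problem Cutts
# raises: Google sees the page but not the scripts and styles that shape it.
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))   # True
```

The point of the sketch is that the block is all-or-nothing per path: Googlebot can fetch the page but is forbidden from the very files it would need to render and understand it.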

Cutts also pointed out that some webmasters will unintentionally block Googlebot from crawling some of their JavaScript. He noted that “a lot of people will do things like check for Flash, but then they’re including some JavaScript, and they don’t realize including that JavaScript – the JavaScript is blocked, and so we’re not able to crawl the site as effectively as we would like.”

So to those who have concerns about the ability of Googlebot to process JavaScript and CSS, Cutts insists that “Google is getting better at processing JavaScript. It’s getting better at things like looking at CSS to figure out what’s important on the page.”

Cutts emphasized that removing the blocks from robots.txt would help everyone: searchers, site owners, and Google. “So, if you do block Googlebot, I would ask, please take a little bit of time, go ahead and remove those blocks from the robots.txt so you can let Googlebot in, get a better idea of what’s going on with your site, get a better idea of what’s going on with your page, and then that just helps everybody…if we can find the best search results, we’ll be able to return them higher to users.”
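In practice, removing the block is usually just a matter of deleting the relevant `Disallow` lines. A hypothetical before-and-after, using the same made-up directory names as an illustration:

```txt
# Before: Googlebot cannot fetch scripts or stylesheets
User-agent: *
Disallow: /js/
Disallow: /css/

# After: the Disallow lines are gone; an empty Disallow permits everything
User-agent: *
Disallow:
```

An empty `Disallow:` directive (or no `Disallow` lines at all) tells crawlers that nothing is off limits, which is the behavior Cutts is asking for here.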

He wrapped up his PSA by acknowledging that he’s asking for a change in behavior, but pointed out that the blocks were no longer necessary. “So thanks if you can take the chance,” he said. “I know it’s kind of a common idiom for people to just say, ‘oh I’m going to block JavaScript and CSS,’ but you don’t need to do that now. So please, in fact, actively let Googlebot crawl things like JavaScript and CSS if you can. Thanks.” You can view the full video on YouTube.

Users at Webmaster World have responded to the video with a certain amount of suspicion. For example, realmaverick expressed concerns that Google might misinterpret something in his JavaScript or CSS files as malicious, and wondered if the search engine had already been penalizing sites based on their JS content. Another forum member, lucy24, argued that blocked content was none of Google’s business. Making a comparison, Sgt Kickaxe wrote “Nothing personal G, I don’t let the oil change guy take a look at my valves either.” It looks like Google can’t do anything without igniting some kind of controversy.
