Four SEO Tips to Help You Think Like Google

Put the term “SEO tips” into Google’s search box and the leading search engine helpfully returns 12.8 million hits. That’s way too many for one person to digest. If you’re looking for a smaller assortment of tricks for getting to the top of the search engine results pages, keep reading.

Before I go any further, I want to make it clear that I make no claims to comprehensiveness. I don’t practice SEO myself; I read about it a lot, and as a writer and editor I think (almost unconsciously) about SEO whenever I write or title an article. That disclaimer out of the way, I’ve found that a lot of SEO boils down to common sense once you understand how search engines “think.” The tips that follow, culled from around the web, are the ones that seem to best illustrate that common-sense side of SEO.

Speak Google’s Language

Google knows all sorts of languages, from spoken to written to programming. It’s been said that Google can crawl just about anything these days, even Flash and JavaScript. That’s not the whole story, though: just because Google can crawl your page doesn’t mean you’re getting the most out of it that you can.

A blog entry on 1st Search Engine Rankings illustrates this point. The author talks about starting a forum that was set up with dynamic PHP URLs. If you’ve ever wondered, a dynamic URL is one that usually features lots of “?” and “=” characters, in part because the page is being constructed on the fly – that is, certain elements are drawn from a database. There’s nothing wrong with that, and there are a lot of good reasons to set up a web site with a database in this way. But it’s not search engine friendly; it isn’t immediately obvious from the URL what your page is about.

The blog entry goes on to say that they set up a modification which turned the URLs into static-looking HTML. URLs for threads would say something like “google-discussion.htm” instead of “showthread.php?t=12345678.” After they installed the mod, they began to “see an increase in both indexed pages and searches which ended in a forum page result.” So if you want to see your traffic increase, set up your URLs so that search engines (and site visitors) are reading real words instead of gibberish.
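If you’re wondering what such a mod does under the hood, here’s a minimal sketch of an Apache mod_rewrite rule that maps friendly URLs back to the dynamic script. This is purely illustrative – it isn’t the actual modification the author installed – and it assumes an Apache server with mod_rewrite enabled, a forum script named showthread.php, and friendly URLs that embed the thread ID:

# Hypothetical .htaccess sketch: turn /google-discussion-12345678.htm
# back into the dynamic URL the forum software actually understands
RewriteEngine On
RewriteRule ^([a-z0-9-]+)-([0-9]+)\.htm$ showthread.php?t=$2 [L]

The visitor (and the googlebot) sees the keyword-rich .htm address, while the forum software still receives the thread ID it needs.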

Make the Most of Google Sitemaps

SEO consultant Rob Sullivan (not to be confused with Danny Sullivan) noted that Google’s Sitemaps program has some features that make it more worthwhile than you might think at first glance. Maybe you aren’t using it yet because you figure it’s too much trouble, or that Google will find and crawl your site anyway. If you are using it, maybe you figure the main benefit is that you can feed your content to Google rather than waiting for the googlebot to find it on its own.

Using a Sitemap can be particularly useful if you have content that a search engine can’t navigate past – Flash, AJAX, or JavaScript navigation, for example. (And remember, it isn’t just Google using Sitemaps anymore; MSN, Yahoo, and Ask all support the same protocol now.) Submitting a Sitemap is a way to help ensure that such content gets indexed.
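Since all of the major engines now support the protocol, the simplest way to point them all at your Sitemap is autodiscovery through robots.txt. Add a line like the following (this assumes your sitemap lives at the root of your site; adjust the URL to match where yours actually sits):

Sitemap: http://www.example.com/sitemap.xml

You can also submit your Sitemap through Google’s webmaster console, but the robots.txt line covers every engine that honors the protocol.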

As Rob Sullivan points out, though, there’s even more going on here, because Google has added features to the program. You need to go through Google’s verification process and make sure your 404 errors are configured correctly, but once you do, you can get more information about your site. For example, the “Top Search Queries” feature tells you which keywords were typed into the search engine when your pages appeared in the search results, regardless of whether those pages were clicked on. By seeing which terms your site turned up for but didn’t inspire click-throughs, you can work out what adjustments to make to appear higher in the SERPs and get more traffic.
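A quick note on that verification step: you prove to Google that you control the site, typically either by uploading an HTML file with a name Google assigns you or by adding a verification meta tag to your home page. As a rough sketch of the tag approach (the content value below is a placeholder – Google generates the real key in your account, so don’t copy this one):

<meta name="verify-v1" content="your-unique-key-from-google" />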

Incidentally, the Google Sitemaps program will also tell you what terms users put into the search box before clicking on your URL – so you can see which keywords were actually successful in driving traffic to your site. You can also check out “Crawl Stats” for a bot’s-eye view of how Google sees your site; you’ll even see any errors your web site generated.

If you want to take advantage of Sitemaps by creating one, there are a few things to keep in mind. First, sitemaps are XML files. Second, each sitemap is limited to 50,000 URLs and 10 MB. You can use multiple sitemap files if you need more room; in that case you’ll also need a Sitemap index file to serve as an entry point, and a single index can reference up to 1,000 sitemaps. Sitemaps can be compressed using gzip, which reduces bandwidth consumption. For more information about the Sitemap protocol, check out the Sitemap website.
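To give you an idea of what these files look like, here is a bare-bones sitemap listing a single page (the URLs and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/google-discussion.htm</loc>
    <lastmod>2007-10-01</lastmod>
  </url>
</urlset>

And here is a Sitemap index file pointing to two gzipped sitemaps:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml.gz</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml.gz</loc>
  </sitemap>
</sitemapindex>

The protocol also defines optional tags for each URL, such as changefreq and priority; the Sitemap website covers those details.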

Don’t Neglect Meta Descriptions

Do meta descriptions still matter? You’ll find some lively debate on that topic in our SEO Chat forums. A recent post from Google’s Webmaster Central Blog seems to indicate that meta descriptions still have their purpose. They may not get you to the top of Google, but they frequently turn up as the description of your site in the SERPs. In other words, they’re seen by web surfers who have just performed a search and are deciding which link to click. Don’t you want to give them more incentive to click yours?

The meta description goes into the HTML code of each page on your web site, and it looks like this:

<meta name="description" content="informative description here">

According to Raj Krishnan of Google’s Snippets Team, Google actually does care about meta descriptions because they “want snippets to accurately represent the web result.” When the descriptions give users a clear idea of the page’s content, Google prefers to use the meta description. Keep in mind that a good meta description is not “comprised of long strings of keywords.” It won’t affect your ranking within the search results either – but it could affect your click-through, and that’s what really matters.

A good meta description tells the reader what that page is about, or what they can do on that page. For example, Google Video’s meta description says “Search and browse all kinds of videos, hosted on sites all over the web, including Google, YouTube, MySpace, MetaCafe, GoFish, Vimeo, Biku, and Yahoo Video.” Make sure you write a different meta description for every page on your web site. Do you have a web site with too many pages to do that? Then prioritize which ones get a meta description rather than creating boilerplate that will do for all of them; Krishnan notes that Google is less likely to display boilerplate descriptions. He says that site owners should “at the very least, create a description for the critical URLs like your homepage and popular pages.”

A well-constructed meta description doesn’t have to be in sentence format. It can simply be structured in a way that tags all the information related to the page. Even there, though, you should avoid duplication and keyword stuffing. Krishnan used the example of a meta description for the seventh Harry Potter book, taken from a product aggregator. It mentioned the book’s title (which had already been used in the title tag for that page), and used the author’s and illustrator’s names twice without identifying who they were. Krishnan then showed a better version:

<meta name="description" content="Author: J. K. Rowling, Illustrator: Mary GrandPré, Category: Books, Price: $17.99, Length: 784 pages">

Why is this better? “No duplication, more information, and everything is clearly tagged and separated,” Krishnan explained. This kind of description can be useful for site pages that list products: you can put everything important in one place, rather than have it scattered all over the page where it’s harder for Google to find. You’ll find more about meta descriptions in the original blog entry on Webmaster Central.

Keep Your Content Fresh

It’s been true since the beginning of the web, well before Google came into existence: content is king online. It’s very important that you publish lots of good content: make it fresh, make it frequent, and make it attractive to your site visitors. And that’s true now more than ever, thanks to some of the things Google has been doing lately.

It has always been possible to search by time in Google’s specialized searches for blog posts and news items, and you can search by time in regular web search through the advanced search options. But now it appears that the search engine giant is making this option more prominent.
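In fact, you can already trigger a date-restricted web search straight from the URL with the as_qdr parameter that the advanced search page uses. For example (the query here is just an illustration):

http://www.google.com/search?q=seo+tips&as_qdr=w

restricts results to the past week; the values d, m, and y work the same way for the past day, month, and year.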

Loren Baker over at Search Engine Journal reported that he received a screen shot showing a normal Google web search – no special features – with a drop-down menu letting users choose posts from any time, the past 24 hours, past week, past month, past 2 months, past 3 months, past 6 months, and past year. He believes that Google is testing out this potentially new aspect of its interface because it is “consolidating its blog, news and other search channels into one Universal search,” so more users will want to search by day or time as a regular thing.

Using these tips will help you with your site’s SEO campaign and your climb to the top of the SERPs. Good luck!
