Before I go any further, I want to make it clear that I make no claims to comprehensiveness. And I don't practice SEO myself; I read about it a lot, and as a writer and editor I think (unconsciously) about SEO whenever I write or title an article. With that disclaimer out of the way, I've found that a lot of SEO boils down to common sense once you understand how search engines "think." In the tips that follow, which I've culled from around the web, I'm going to focus on the ones that highlight this aspect of SEO.
Speak Google's Language
A blog entry on 1st Search Engine Rankings illustrates this point. The author talks about starting a forum that was set up with dynamic PHP URLs. If you have ever wondered what a dynamic URL is, it usually features lots of "?" and "=" characters because the page is constructed on the fly – that is, certain elements are drawn from a database. There's nothing wrong with that, and there are a lot of good reasons to set up a web site with a database in this way. But it's not search engine friendly; it isn't immediately obvious from the URL what your page is about.

The blog entry goes on to say that they installed a modification which turned the URLs into static-looking HTML. URLs for threads would read something like "google-discussion.htm" instead of "showthread.php?t=12345678." After they installed the mod, they began to "see an increase in both indexed pages and searches which ended in a forum page result." So if you want to see your traffic increase, set up your URLs so that search engines (and site visitors) read real words instead of gibberish.
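The post doesn't name the specific mod, but on an Apache server the same effect is commonly achieved with mod_rewrite. The rule below is a hypothetical sketch, not the forum's actual configuration: it lets the visitor (and the googlebot) see a keyword-rich URL while Apache quietly maps it back to the real dynamic script.

```apache
# Hypothetical sketch: the browser requests
# /threads/google-discussion-12345678.htm, while Apache internally
# serves showthread.php?t=12345678. The thread ID at the end of the
# URL is captured by the parentheses and passed along as t=.
RewriteEngine On
RewriteRule ^threads/[a-z0-9-]+-([0-9]+)\.htm$ showthread.php?t=$1 [L,QSA]
```

Note that the keyword portion of the URL is ignored by the rule; only the trailing ID matters, so a thread's title can change without breaking old links.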
SEO consultant Rob Sullivan (not to be confused with Danny Sullivan) noted that Google's Sitemap program has some features that make it more worthwhile than you might think at first glance. Maybe you aren't using it yet because you figure it's too much trouble, or that Google will find and crawl your site anyway. If you are using it, maybe you figure that the main benefit is that you can feed your content to Google rather than waiting for the googlebot to find it on its own.

As Rob Sullivan points out, though, there's even more going on here. Google has added features to the program. You need to go through Google's verification process and make sure your 404 errors are configured correctly, but once you do, you can get more information about your site. For example, the "Top Search Queries" feature tells you which keywords were typed into the search engine when your pages appeared in the search results, regardless of whether those pages were clicked on. By seeing which terms your site turned up for but did not inspire click-throughs, you can consider what adjustments to make to appear higher in the SERPs and get more traffic.
Incidentally, the Google Sitemap program will also tell you which terms users put into the search box before clicking on your URL – so you can see which keywords were actually successful in driving traffic to your site. You can also check out "Crawl Stats" for a bot's-eye view of how Google sees your site; you'll even see any errors generated by your web site.
If you want to take advantage of Sitemaps by creating one, there are a few things to keep in mind. First, they're XML files. Second, there is a limit of 50,000 URLs and 10 MB per sitemap. You can use multiple sitemap files; you'll need to include a Sitemap index file to serve as an entry point, covering a total of up to 1,000 sitemaps. Sitemaps can be compressed using gzip, which reduces bandwidth consumption. For more information about the Sitemap protocol, check out the Sitemap website.
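For reference, the protocol itself is simple XML. Here is a minimal one-URL sitemap; the URL and date are placeholders, not a real site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child of url -->
    <loc>http://www.example.com/google-discussion.htm</loc>
    <!-- lastmod, changefreq, and priority are optional hints -->
    <lastmod>2007-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Save it as sitemap.xml (or sitemap.xml.gz if you compress it), and if you use more than one file, list each one in your Sitemap index.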
You'll find some lively debate about meta descriptions in our SEO Chat forums. A recent post from Google's Webmaster Central Blog seems to indicate that they still have their purpose. They may not get you to the top of Google, but they frequently turn up as the description of your site in the SERPs. In other words, they're seen by web surfers who have just performed a search and are deciding which link to click. Don't you want to give them more incentive to click yours?
The meta description goes into HTML code for each page in your web site, and it looks like this:
<meta name="Description" content="informative description here">
According to Raj Krishnan of Google's Snippets Team, Google actually does care about meta descriptions because they "want snippets to accurately represent the web result." When the descriptions give users a clear idea of the page's content, Google prefers to use the meta description. Keep in mind that a good meta description is not "comprised of long strings of keywords." It won't affect your ranking within the search results either – but it could affect your click-through, and that's what really matters.
A good meta description tells the reader what that page is about, or what they can do on that page. For example, Google Video's meta description says "Search and browse all kinds of videos, hosted on sites all over the web, including Google, YouTube, MySpace, MetaCafe, GoFish, Vimeo, Biku, and Yahoo Video." Make sure you write a different meta description for every page on your web site. Do you have a web site with too many pages to do that? Then prioritize which ones get a meta description rather than creating boilerplate that will do for all of them; Krishnan notes that Google is less likely to display boilerplate descriptions. He says that site owners should "at the very least, create a description for the critical URLs like your homepage and popular pages."
A well-constructed meta description doesn't have to be in sentence format. It can simply be structured in a way that tags all the information related to the page. Even there, though, you should avoid duplication and keyword stuffing. Krishnan used the example of a meta description for the seventh Harry Potter book, taken from a product aggregator. It mentioned the book's title (which had already been used in the title tag for that page), and used the author's and illustrator's names twice without identifying who they were. Krishnan then showed a better version:
<meta name="Description" content="Author: J. K. Rowling, Illustrator: Mary GrandPré, Category: Books, Price: $17.99, Length: 784 pages">
Why is this better? "No duplication, more information, and everything is clearly tagged and separated," Krishnan explained. This kind of description can be useful for site pages that list products; you can put everything important in one place rather than have it scattered all over the page, hard for Google to find. You'll find more about meta descriptions in the blog entry itself.
It's been true since the beginning of the web, well before Google came into existence: content is king online. It's very important that you publish lots of good content: make it fresh, make it frequent, and make it attractive to your site visitors. And that's true now more than ever, thanks to some of the things Google has been doing lately.
It's always been possible to search by time when doing specialized searches on Google for blog posts and news items. You can also search by time in Google when you click on advanced options. But now it appears that the search engine giant is making this option more prominent.
Loren Baker over at Search Engine Journal reported that he received a screen shot showing a normal Google web search – no special features – with a drop-down menu letting users choose results from any time, the past 24 hours, past week, past month, past 2 months, past 3 months, past 6 months, or past year. He believes that Google is testing this potentially new aspect of its interface because it is "consolidating its blog, news and other search channels into one Universal search," so more users will want to search by day or time as a regular thing.
Using these tips will help your site's SEO campaign and its climb to the top of the SERPs. Good luck!