I’m going to start by talking about one aspect of site accessibility. Kalena Jordan, writing for Site Pro News, mentioned two scenarios that I sincerely hope we don’t see too often today. The first involves taking a site offline for maintenance. It’s true that eBay performs regular maintenance on its site every week, on Friday, but that fact is well known to the regulars and easy to discover if you’re new. During a scheduled maintenance period, eBay’s features are intermittently unavailable or slow. eBay always announces scheduled maintenance well in advance and states how long it will last.
eBay does not put up a “back in an hour” sign on its site while it performs maintenance, and neither should you. Not only will users get annoyed, but what happens if a search engine spiders your site during this time? It won’t recognize the sign; it will simply assume your pages have expired and remove them from the index. If that makes you shiver, it should. All of your hard SEO work to get to the top of the rankings will go down the drain until the spiders come back. Consider setting up a mirror site for maintenance periods.
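If you truly must take the site down, the crawler-friendly approach is to answer requests with HTTP 503 (Service Unavailable) instead of a normal page, so spiders know the outage is temporary rather than permanent. Here is a minimal sketch as a WSGI application; the app name, message, and Retry-After value are illustrative, not a recommendation of specific numbers:

```python
# Minimal maintenance-mode sketch (assumption: a WSGI server such as
# gunicorn serves this app during the maintenance window).
def maintenance_app(environ, start_response):
    # 503 tells crawlers the outage is temporary, so indexed pages
    # are not treated as expired and dropped.
    status = "503 Service Unavailable"
    body = b"<html><body><h1>Down for maintenance - back shortly.</h1></body></html>"
    headers = [
        ("Content-Type", "text/html"),
        ("Content-Length", str(len(body))),
        # Retry-After hints (in seconds) when the crawler should return.
        ("Retry-After", "3600"),
    ]
    start_response(status, headers)
    return [body]
```

The key detail is the status code: a “back in an hour” page served with a normal 200 status looks to a spider like your real content has been replaced.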
The other situation Jordan mentioned involved one of her now ex-clients, who took her site down for three whole weeks during her Christmas vacation without telling Jordan. She’d just spent a month earning good rankings for the site, too, all of which was wasted. A web site is not like a brick-and-mortar business. Even if you have no one there to take phone calls, the site can act as a sort of “information stand-in,” letting people know what you have to offer.
You can certainly put a note on the site explaining that the principals are on vacation and the business is temporarily closed, mentioning when you’ll be back – but by all means leave the rest of the site up. It will give your site a chance to hold its place in the SERPs, and rather than abandoning your site completely, visitors might consider what they’d like to purchase when you return. It’s better still, of course, if you set up your site so that they can still place orders, with the understanding that you’ll fill them as soon as you can when you get back.
Okay, if you don’t do your own SEO, I sincerely hope you know better than to hire the SEO company that just sent you an e-mail out of the blue saying that they can drastically improve your position in the SERPs. That is spam. No, you can’t buy hundreds or thousands of good links for $50; no, you can’t believe them when they say they can quickly get you a top position in the SERPs for whatever keywords you want.
Make no mistake; good backlinks are worth their weight in gold. But the good ones take real time and effort to generate. The ones that you’ll be getting for your $50 or $100 will be unrelated to your content, and therefore not counted as particularly valuable by the search engines. Besides, if your site suddenly gets tons of backlinks, that’s going to look very suspicious to the search engines, and could result in a penalty.
Likewise, submitting your URL to search engines is a waste of time. They’ll come and index it soon enough; sooner than you might think, in fact, if you get some good links from other sites for the spiders to follow. That’s assuming you’ve set up your site so that they can crawl it (more on that in a bit).
Naturally, if you shouldn’t waste time submitting your URL to the search engines, you also shouldn’t waste money on having someone else submit your web site to the search engines. Don’t do this even if they say that they’ll submit it to thousands of search engines and directories. That strategy might have worked five or more years ago, but it certainly doesn’t work today. The problem is that most of the directories your site is likely to be submitted to this way are link farms, and those do get penalized by Google. The only directories that I have consistently heard might still be worth submitting to are DMOZ and Yahoo’s directory.
What about getting a link to your site from Wikipedia? The vast majority of articles in that volunteer-edited online encyclopedia do link to relevant web sites. However, Wikipedia uses rel="nofollow" on those links, which tells the search engines not to pass any ranking credit through them. On the other hand, readers of an article might still click one of the reference links to find out more information, and rel="nofollow" certainly doesn’t prevent that. Just remember that if your link isn’t truly relevant, it can be removed by one of Wikipedia’s editors.
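To see what a nofollow link actually looks like in markup, here is a small sketch using Python’s standard html.parser module to collect links and note which ones carry the rel="nofollow" attribute. The class name and the sample anchor tag are made up for illustration:

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collects (href, is_nofollow) pairs from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            d = dict(attrs)
            # rel="nofollow" is the hint that no ranking credit flows.
            self.links.append((d.get("href"), "nofollow" in (d.get("rel") or "")))

checker = NofollowChecker()
checker.feed('<a rel="nofollow" href="http://example.com/">Example</a>')
print(checker.links)  # [('http://example.com/', True)]
```

Human visitors never see the attribute, which is exactly why the traffic value of such a link survives even though the ranking value does not.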
Some of the issues surrounding content also touch on other areas of concern. For example, you want to make sure your web site looks fresh and that you address your customers’ concerns, right? Apparently not every site owner worries about that, though they should. Jordan mentioned that she was once tasked with rewriting the web copy of a large real estate firm while working as a PR consultant. The firm had pages where customers could leave feedback. Imagine her surprise when she saw that extremely negative feedback from one of their customers had sat on those pages for at least a year! Somebody should have noticed that content and addressed the issues it raised.
If you’re a regular reader of our SEO Chat newsletter, you probably saw in a recent Spotlight that Google is now treating domains and subdomains a little differently. What this means is that, if SEO considerations drove your decision to split your content among several subdomains (i.e., trying to grab as many spots in the SERPs as possible), you need to change your focus. Instead, look at it from your visitors’ point of view: based on the kind of content, does it make sense to split it up among several domains or subdomains?
Your content is worthless if visitors can’t find it. If they can’t find you in the search engines, that reduces their chances of being able to find your content. And they won’t be able to find you in the search engines if the spiders can’t crawl your site. Back in July, Search Engine Land ran an article that listed some of the reasons a site may be uncrawlable. These include:
- An incorrect robots.txt file.
- Using session IDs or too many variables in your URLs.
- A convoluted navigation menu.
- Too much use of Flash, graphics, or AJAX.
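The first item on that list is the easiest to check yourself. Python’s standard urllib.robotparser reads a robots.txt file the same way well-behaved spiders do, and this sketch shows how a single stray Disallow: / rule makes every page off-limits. The file contents and URLs are a made-up example:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that accidentally blocks everything --
# one stray "Disallow: /" is enough to make a site uncrawlable.
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Any compliant crawler, Googlebot included, must now skip every URL.
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))  # False
```

It’s worth running a check like this whenever the file changes; a robots.txt mistake fails silently, and you only notice when your pages start disappearing from the index.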
Another mistake many sites make is not having enough real content. Maybe you think your customers don’t want content. Or maybe you think your field or the way you have your web site set up doesn’t lend itself to producing good content. Recognize these for what they are: excuses. Web surfers go on the web looking for content. If you want them to visit your site and link to you – thus increasing your traffic and possible conversions – you need to offer them some value. That means content.
Can’t think of any way to add content to your site off the top of your head? How about a glossary of terms for your field? Perhaps a frequently asked questions page makes sense. You might even be able to come up with some how-to articles. The list can go on and on, if you get a little creative.
The misuse of keywords could almost fill a chapter in a book of SEO mistakes all by itself. This can cover a lot of ground, really, because there are so many creative ways you can hurt yourself with these things. From the reading I’ve done on the subject, keywords should be treated like the spices in a mild curry — use enough so that they’re noticeable, but not so much that the flavor integrity of the whole dish gets drowned out.
There are a number of places you should not use your keywords multiple times. That is called keyword stuffing, and may be treated like spam by the search engines. These include your meta tags, image alt tags, titles and headers (write good titles and headers; don’t stuff them!), and so on. Frankly, you should just avoid keyword stuffing altogether. Likewise, please do not include the same meta description and keywords on every page! Search engines look at these when they perform duplicate content checks, so you could actually be penalized for a lack of creativity. You want to have different title elements and headers on your pages, too.
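A quick audit script can catch duplicated titles before the search engines do. This sketch extracts each page’s title element and flags any that repeat; the sample pages, class, and function names are all hypothetical:

```python
from collections import Counter
from html.parser import HTMLParser

class TitleGrabber(HTMLParser):
    """Accumulates the text inside a page's <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def page_titles(pages):
    out = []
    for html in pages:
        grabber = TitleGrabber()
        grabber.feed(html)
        out.append(grabber.title)
    return out

# Illustrative pages: two share the same title, which an audit should flag.
pages = ["<title>Home</title>", "<title>Home</title>", "<title>Contact</title>"]
dupes = [t for t, n in Counter(page_titles(pages)).items() if n > 1]
print(dupes)  # ['Home']
```

The same loop works for meta descriptions; the point is simply that repetition across pages is detectable mechanically, by you or by a search engine.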
Don’t even think of coming up with devious ways to do keyword stuffing, such as hidden text (using white text on a white background so only the search engines see it). Yes, the search engines are wise to this trick, and it’s likely to get you penalized. If you think you can get away with it by using external cascading style sheets (CSS) to control the font color, think again. The search engines know about that trick too.
There are other ways to abuse keywords that may not hurt as badly, so much as make you pull your hair out. If you’ve heard of the long tail, then you know you shouldn’t be targeting overly general keywords. They’re nearly impossible to rank for. If you’re an accountant in San Francisco, you should be trying to rank for “San Francisco accountant,” not “accountant.” You could even try “San Francisco small business accounting,” if that’s your specialty. If you optimize your site for keywords that are both relevant and specific, you will receive targeted traffic that is much more likely to convert.
Finally, let’s look at content again, from the keyword perspective. That brings up keyword density. Please don’t start obsessing over getting the perfect percentage of keywords in your content. Yes, they should be in your text, but as I mentioned before, you don’t want them to overwhelm things. Please remember that you’re not writing for the search engines; you’re writing for human readers. And we get tired when we see the same word over and over again. Let your text flow naturally. If you’re writing on a topic that is relevant to your site, the keywords will fall naturally into place. Good luck!
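If you want a sanity check rather than a target to chase, keyword density is nothing more than the fraction of words in your text that match the keyword. A small sketch, with an illustrative function name and sample sentence:

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` matching `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

sample = "San Francisco accountants help San Francisco businesses."
# 2 matches out of 7 words.
print(round(keyword_density(sample, "francisco"), 2))  # 0.29
```

Use a number like this only to spot obvious stuffing; there is no “perfect” percentage, and natural writing will land in a sensible range on its own.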