“Memory is part of usability. That’s why navigation is such a pain in the neck to work out and make usable for everybody who will use your web site. There are ads, images, blinking thingy’s distracting your eyes, and your mind is going everywhere at once. Yet, when I tell a web site owner that people have trouble reading a page when there are 10 things happening all at once, they act as though I stole their blankie.” –Kimberly Krause Berg
I began writing this article in October, and fittingly, I wanted to finish it on November 3rd, which is World Usability Day. I write my articles in a funny way, some might think: I write the end, then I write the middle, and lastly, I write the introduction. I didn’t know anything about World Usability Day until I ran across the concept on CNN’s morning news program, where Elizabeth Rosenzweig, the World Usability Day Chair, kicked off events in the US at Hour 21.
“Why doesn’t this work better? Why can’t they make this easier?” World Usability Day is for all the people who’ve ever asked questions like these. This Earth-Day-style event, focused on easy-to-use technology, currently involves plans in more than 70 cities in 30 countries. WorldUsabilityDay.org, online headquarters of the international day, says, “World Usability Day promotes the value of usability engineering, user-centered design, and every user’s responsibility to ask for things that work better. The Usability Professionals’ Association is doing that by encouraging, organizing, and sponsoring 36 hours of activities at the local level around the globe, all occurring on November 3, 2005.” Events included surveys on how appropriate use of color greatly impacts the user experience online, local seminars and conferences, and open houses.
Website usability deals with ease of navigation, how colors work together to enhance a user experience without hurting the eyes, the size of fonts, and design. But some of the concepts of website usability have a direct effect on how search engines crawl your website, and index it accordingly.
A navigation path is the sequence of pages a visitor views from the moment they enter the site to the moment they leave; search engine bots follow navigation paths of their own as they crawl.
Anchor text – Anchor text is the clickable text in a hyperlink that points a user to another web page or website. SEOs will tell you how important anchor text is for optimization, because it is another opportunity to include your keywords and tell a search engine why the link is relevant to your content. It matters in a usability sense as well, because it directs a visitor to the linked page while giving them an idea of what that page is about. Using “Click Here” is really a waste of your SEO opportunity. What seems self-explanatory to you may not be as simple for your visitors. A button labeled “Knowledge Base” may not mean much to some people; simply calling that button “Help” would probably assist them better.
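The difference is easy to see in markup. A minimal sketch, with a hypothetical URL, contrasting vague and descriptive anchor text:

```html
<!-- Vague: tells neither the visitor nor the search engine
     what the destination page is about -->
<a href="/kb/index.html">Click Here</a>

<!-- Descriptive: the anchor text itself describes the linked page -->
<a href="/kb/index.html">Help with billing and account settings</a>
```

The second link carries your keywords and sets the visitor’s expectations before they ever click.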
Breadcrumbs – The breadcrumb trail provides information to users as to where they are located within the site, and it offers shortcut links for users to “jump” to previously viewed pages without using the BACK button, other navigation menus or buttons, or using a keyword search. Breadcrumbs also give search engines an easy navigation path to follow to find all the pages of your site.
The term “breadcrumb” comes from the Grimm’s fairy tale, Hansel and Gretel. Hansel left a trail of breadcrumbs through the woods so that he and Gretel could find their way back home. Since today’s internet user will have a need to navigate back through a website path, the “breadcrumb trail” was named. Using breadcrumbs is not only a good idea for your visitor; it also gives search engine spiders a path to follow for every page within the website.
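A breadcrumb trail is usually nothing more than a row of plain text links. A minimal sketch, with hypothetical page names and paths:

```html
<!-- Each crumb is a plain text link, so both visitors and
     search engine spiders can follow the trail back up -->
<div class="breadcrumbs">
  <a href="/">Home</a> &gt;
  <a href="/articles/">Articles</a> &gt;
  <a href="/articles/seo/">SEO</a> &gt;
  Website Usability and SEO
</div>
```

Because the crumbs are ordinary links, they double as a crawl path through every level of the site.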
Titles – Clear, concise titles not only tell visitors what your site is about when they find it in the search engine results; they also help a search engine determine the relevancy of your site to its indexed pages. A search engine can scan the context of a web page and determine whether the title is indeed a good match. Visitors also dislike clicking on a promising title in the search results only to find that the web page has nothing to do with what they were looking for, and that the title was misleading. Make sure your titles are short, directly to the point, and, most importantly, an accurate reflection of the content of the page.
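A quick sketch of the difference, using a made-up business name:

```html
<head>
  <!-- Vague: says nothing about the page or the business -->
  <!-- <title>Welcome to our site!!!</title> -->

  <!-- Short, specific, and an accurate summary of the page -->
  <title>Handmade Oak Bookshelves - Smith Woodworks</title>
</head>
```

The second title reads sensibly both in a browser tab and on a search results page.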
Directories and Levels
A colleague, Mike Shapiro, whom I consider a friend, once told me, “Remember, the average clicker spends 7 seconds deciding whether to stick around.” Statistics show that web users like to visit sites that have what they need no more than a couple of clicks away from the home page or landing page. If a web user has to go through a dozen links in order to get to what they are looking for, they will very likely give up in frustration and move on to the next site offering what they need.
In the same way, many search engine bots don’t crawl more than a couple of directory levels deep. If the meat and potatoes of your site sits deep within the site structure, you can count on those pages not being crawled or indexed quickly, if at all. Even with a great navigation structure and breadcrumbs in place, crawling deep into a website’s structure is hard on a search engine server’s resources.
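Directory depth shows up directly in your URLs. A sketch, with hypothetical paths, of the same page buried deep versus kept shallow:

```html
<!-- Deep: four directory levels down; many bots may never reach it -->
<a href="/products/furniture/wood/oak/bookshelves.html">Oak bookshelves</a>

<!-- Shallow: one level below the home page, within easy crawling reach -->
<a href="/bookshelves/">Oak bookshelves</a>
```

Keeping your most important pages near the top of the directory tree helps both the seven-second visitor and the spider.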
Fonts and Headings
Font size and color – Tiny text isn’t easy for web visitors to read, and if they have to struggle to read your content, there is a good chance they won’t even try. Further, many search engines give more weight to larger text than to smaller text. It’s also important that a web visitor can read the text without straining their eyes because of the color of the font or the background.
It’s also difficult for your web visitor to read flashing, scrolling, or blinking text. Search engines tend to disregard such text as well, so avoiding this type of font behavior is usually best.
Headings – The World Wide Web Consortium recommends using heading tags to structure an HTML document. <H1> through <H6> give natural stopping and starting points in a web page and alert a visitor to the various sections of the page. Many search engines do take these headings into account as important to the context of the page. However, headings have been abused in the past, and I don’t believe search engines give them as much weight as they used to. Still, organizing your pages into sections is a good idea for structural, navigational, and organizational purposes, both for the site visitor and for the search engines.
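Structuring a page this way is straightforward. A minimal sketch of a heading outline, with placeholder content:

```html
<!-- One h1 for the page topic, then h2/h3 for its sections,
     used in order rather than skipping levels -->
<h1>Website Usability</h1>
<p>Introduction to the topic...</p>

<h2>Navigation</h2>
<p>Section content...</p>

<h2>Fonts and Headings</h2>
<h3>Font size and color</h3>
<p>Subsection content...</p>
```

The outline a visitor skims is the same outline a search engine reads for context.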
A clean and uncluttered design is usually a win-win for both your visitors and the search engines. The simplest and most cleanly coded websites are usually the ones visited and crawled the most; people know they will find what they are looking for and where to find it, and they usually become repeat visitors. Search engines also like to crawl sites that are not heavy on their resources. Any time a search engine has to wade through a website in order to find the content, it taxes the search bot’s resources and may make it spider the site less often.
Cleanly coded, standards-compliant HTML also makes it easier for the next web designer to change the content. Being able to find your way around someone else’s code matters to anyone looking at a page for the first time, and it makes proofreading, editing, updating content, and fixing issues that may prevent the site from being spidered or ranking well that much easier.
Extra tags – It’s also my belief that the <font> tag will soon be deprecated as more browsers conform to CSS standards. Tags like <b> and <i> are already being replaced by <strong> and <em>, but in a text-to-code ratio analysis, all of those HTML tags add up considerably. Keeping them to a minimum is in your best interest. For example, instead of a table cell’s tags looking like this:
<td><td width="350" height="200"><td align="center">
It should be more like this:
<td width="350" height="200" align="center">
There are many situations where an HTML editor, like FrontPage, will add in those extra tags when you make changes at a later time, whereas other editors, like Dreamweaver, will group the attributes together for you. Consolidating these tags is not only more search engine friendly; it also gives you cleaner code for easier updating later.
CSS – CSS, or Cascading Style Sheets, is a very search engine friendly language, largely because CSS can be moved into external files; search engines seem to disregard the CSS styling anyway. There is really no reason to keep CSS styles within the <head> section of your web page when you can easily reference an external style sheet from the page, keeping your code tidy and your text-to-code ratio intact.
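Referencing an external sheet takes a single line. A sketch, with a hypothetical filename:

```html
<head>
  <!-- One link element replaces an embedded block of style rules;
       the stylesheet path is just an example -->
  <link rel="stylesheet" type="text/css" href="/styles/site.css">
</head>
```

Every page that includes this line shares the same styles, and none of them carries the style rules in its own markup.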
People generally read left to right and top to bottom. They tend to scan the first sections of a page before deciding whether it is what they are looking for. Similarly, a search engine scans the text in the same order: left to right, top to bottom. A search engine treats the content at the top and left of the page as far more important than the content to the right and at the bottom.
Marketing analysis shows that people tend to disregard content on the right side of a page more than they would the same content on the left. This has to do with the way we read. Google is currently reanalyzing its AdWords positioning and testing other placements for its ads for this very reason.
There are two specific reasons for having a sitemap on your website: a good sitemap helps your users find what they are looking for, and it helps the search engines better index your entire site. It’s always a good idea to group your site’s pages under topics or categories. If you sell products and services on your site, or have different sections, like articles, news, products, and services, it makes sense to place them into different groups. If you find yourself with more than 50 links on a single page, break your sitemap into several pages, grouped by category.
Taking the time to design a good site map ensures people will easily find the page or section that interests them, which is a good usability technique. Also, make sure your site map is directly linked from your homepage, and that it is a text link. This last step is important, since one of the first things a search engine robot looks for when it arrives at a website is the site map link. In one move, you give your visitors ease of use and the search engines easy indexing, achieving website usability and SEO together.
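Put together, a site map page is just text links grouped by category. A minimal sketch, with hypothetical section and page names:

```html
<!-- Text-link site map grouped by category; keep each page
     under roughly 50 links, splitting by group if needed -->
<h2>Articles</h2>
<ul>
  <li><a href="/articles/usability.html">Website Usability</a></li>
  <li><a href="/articles/seo.html">Search Engine Optimization</a></li>
</ul>

<h2>Products</h2>
<ul>
  <li><a href="/products/bookshelves.html">Bookshelves</a></li>
  <li><a href="/products/tables.html">Tables</a></li>
</ul>
```

On the homepage, the corresponding text link could be as simple as `<a href="/sitemap.html">Site Map</a>`, placed where a robot will encounter it early.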
Stats to Help with Usability
Your website statistics can help you identify where your visitors are going, or tend to go, and then you can streamline your site accordingly. If your visitors get hung up on a page that has no place else to go, then navigation path statistics can help you determine where those visitors are getting stuck. You can then alter your navigation of those pages to make it easy for your visitors to get back to where they want to be.
From a marketing view, it is important to see the path the visitor takes, and which series of events is the most effective. Frequent exit patterns will show you where your site is underperforming. Error statistics can also help you improve the usability and navigation of your website.
There is absolutely no reason at all that your design can’t cater to both visitors and search engines. There are many techniques and tools that you can utilize to make your website more accessible and search engine friendly, such as external style sheets, clean code, or easy navigation systems. Website usability can definitely go hand in hand with good SEO, and these things I’ve outlined for you today are not at all difficult to implement. Not only will these things make it easier for a visitor to want to keep coming back to your site, but they will also give search engine bots incentives to crawl your site more frequently, and index you accordingly.