Designing Websites For Humans In A World Of Robots

In this day and age, it can be easy to forget the basics of why your website is online. Crawlers and robots come and go, but they never pay. That's where your visitors come in.

With the ever-increasing number of web pages and documents available on the internet, it has become difficult to find information quickly and without having hundreds of advertisements thrown in our faces (most of which have no relevance to the information we are seeking). There is quite simply no realistic way to find material on the internet other than using search engines. At this point, everyone should realize that search engines use "robots" to "crawl" the internet and collect web pages and other documents. The search engine then uses these documents to build its "index" or "database". This in itself is not a problem, but to every action there is an equal and opposite reaction (Newton's third law of motion!).

With this expansion of information on the web, which has driven more people to use search engines daily, the search engines have had to become more active in order to keep their databases up to date. This means crawling more web pages at a greater frequency. Website owners have certainly noticed this increase in activity, and they have not stared at it blankly; they have reacted. They now realize that these search engines produce a significant percentage of their traffic (up to 90% in some cases). So what to do?

Again, with the expansion of the web there has also come more competition in essentially every industry, from computers, travel, and food right down to buying pets online. This competition is healthy in that it has pushed prices lower, but the very same competition has indirectly lowered the overall satisfaction level of website visitors. Let me explain.

The Full Potential

First, let us realize the potential of search engines in today's society. We all know that Google currently powers Yahoo!, and we all know that Overture provides its listings to many search engines. Google, being the prime focus of my current work, pulls in an amazing 150 million different search queries every day, or at least so they claim. Add that to Yahoo!'s many millions of searches per day and we are talking about a lot of traffic. This is important because Google's results are free to be listed in, meaning that with the right properties, your website has the potential to hold the number one position for any given search term regardless of money (ignoring the sponsored links).

With such a great deal of traffic flowing through search engines, everybody wants the number one slot. That is the nature of competitiveness. It forces webmasters and SEOs (including myself) to analyze Google and other search engines to determine what properties are required to rank number one on the results pages. This in itself is still not a problem, but when conclusions are drawn from that analysis, we are usually left with a result requiring some page modifications to improve the ranking of those pages, and this is where some webmasters and SEOs get confused.

In the attempt to modify their pages to rank number one, webmasters will often sacrifice how usable a page is to human visitors, and this is the most lethal mistake anybody could make. When a robot visits your site, it sees a whole pile of text, and based on that text it will rank your site for different terms. This text has no formatting in the graphical layout sense, although the weight of some text is measured differently depending on where it appears (as the sketch below illustrates).

Balance

When a user visits a web page, they get the full effect of layout and graphical interfacing, and this graphical interface is very important in how visitors use our sites and, ultimately, buy products, hence generating revenue. If one were to compromise too much in terms of what the visitors see, we might end up with a website that simply will not produce profit, even if it does rank in the top slot for competitive terms at some of the larger search engines. This is a condition I refer to as "Over Robot Optimizing", and it is a common practice on today's internet.
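To make the contrast concrete, here is a minimal sketch, in Python and purely for illustration, of the robot's-eye view described above. It is not any search engine's actual parser; real crawlers are far more sophisticated and, as noted, weight text differently by the tags it sits in. The page markup here is invented:

```python
# A minimal sketch of how a crawler might reduce a page to plain text.
# Purely illustrative; real engines also weight text by its tags
# (titles and headings count for more than body copy).
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        # Collect every run of text between tags, ignoring markup.
        text = data.strip()
        if text:
            self.chunks.append(text)

page = """
<html><head><title>Buy Widgets Online</title></head>
<body><h1>Widgets</h1><p>The best <b>widgets</b> on the web.</p></body>
</html>
"""

extractor = TextExtractor()
extractor.feed(page)
print(" ".join(extractor.chunks))
# -> Buy Widgets Online Widgets The best widgets on the web.
```

Everything a human relies on (headline size, colour, placement) is gone; only the words and the tags they sit in remain for the robot to judge.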

There is a balance between optimizing websites for robots and keeping a site convenient for users, although that balance is sometimes hard to achieve. With the correct balance, a website ranking at number six in the search engines can easily out-perform a more highly ranked competitor in terms of revenue generation and conversion rate. Furthermore, advertising costs will be lower in the long term, leading to higher profit margins.
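As a quick hypothetical illustration (every figure below is invented for the sake of argument, not measured data), suppose the number one result is an over-optimized page that converts poorly, while the number six result is a balanced page that converts well:

```python
# Hypothetical figures, purely to illustrate the balance argument above.
rank1_visitors, rank1_conversion = 1000, 0.005   # 1,000 visits/day, 0.5% buy
rank6_visitors, rank6_conversion = 300, 0.025    # 300 visits/day, 2.5% buy
avg_order_value = 40.0                           # assumed dollars per sale

for name, visitors, conversion in [
    ("Rank #1 (over-optimized)", rank1_visitors, rank1_conversion),
    ("Rank #6 (balanced)", rank6_visitors, rank6_conversion),
]:
    sales = visitors * conversion
    print(f"{name}: {sales:.1f} sales/day, ${sales * avg_order_value:.2f}/day")
# Rank #1 (over-optimized): 5.0 sales/day, $200.00/day
# Rank #6 (balanced): 7.5 sales/day, $300.00/day
```

Despite pulling in less than a third of the traffic, the balanced page produces more revenue, which is exactly the point of the balance described above.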
Website Design Guidelines
Designing websites for humans is a far wider topic than can be covered in the scope of this brief article, and the specifics will vary by website. That said, you might find these general guidelines helpful:

1. Provide a header on each page so that your visitors can see clearly what the page they have loaded relates to. This header should be the largest text on the page.
2. Content text should be no more than 1/3 of the size of the header; this ensures the page does not look too "monotone".
3. Navigation menus should be very clear and easy to use. The menu should be present on every page, and the user should not have to rely on another source (such as a framed page) for navigation.
4. Pages should be as light on images as possible, as many people out there still use 56k modems, and loading images takes time on a 56k connection.
5. Pages should be no more than 40k in size (HTML source size) unless they are long-form documents such as technical studies, technical papers, or specification sheets. A rough way to check this is sketched after this list.
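Guidelines 4 and 5 lend themselves to a mechanical check. The sketch below (Python; the URL is a placeholder, and the 40 KB threshold is simply the figure from guideline 5) counts the images on a page and measures the size of its raw HTML source:

```python
# A rough, hypothetical audit of guidelines 4 and 5: count the images
# on a page and check that the HTML source stays under roughly 40 KB.
from html.parser import HTMLParser
from urllib.request import urlopen

class ImageCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.images = 0

    def handle_starttag(self, tag, attrs):
        # Each <img> tag is one more download for the visitor.
        if tag == "img":
            self.images += 1

# Placeholder URL; substitute the page you want to audit.
html = urlopen("http://example.com/").read().decode("utf-8", "replace")

counter = ImageCounter()
counter.feed(html)

size_kb = len(html.encode("utf-8")) / 1024
print(f"HTML size: {size_kb:.1f} KB (guideline: under 40 KB)")
print(f"Images on page: {counter.images} (fewer is kinder to 56k users)")
```

Note that this measures only the HTML source, as guideline 5 does; the images themselves add further download time on top of it.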
