The Lowdown on Keyword Density

"Keyword” is a term often used by various people; however, few people actually understand what keywords have to do with websites and search engines. It may seem apparent that “search” and “engine” would be keywords for Google, but can we actually apply a definition to “keyword” in general that will help us understand it?

A keyword is, more or less, any word that appears on a web page. These are the words that make up the page. Whenever you read a website, you are unconsciously processing all of the keywords on it; the words that stand out most to you are most likely the “key” words.

Although this concept is relatively straightforward for the human brain, it is actually somewhat difficult to quantify the value of a word relative to the others on the page. Initially, it was thought that the ideal method was simply to count the number of times a word appeared on the page: the more times it appeared, the more important the word was.
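To make that idea concrete, here is a minimal sketch of the naive approach in Python; the function name and word pattern are illustrative, not anything a real search engine actually uses:

```python
# Naive early-style keyword ranking: importance is just raw frequency.
import re
from collections import Counter

def naive_keyword_rank(page_text, top_n=5):
    """Return the top_n most frequent words: the crude 'importance' metric."""
    words = re.findall(r"[a-z']+", page_text.lower())
    return Counter(words).most_common(top_n)

print(naive_keyword_rank("We design websites. Design is what we do."))
# [('we', 2), ('design', 2), ('websites', 1), ('is', 1), ('what', 1)]
```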

However, what happens if, knowing this, somebody building a web page simply repeats the same word over and over?

The study of keywords has come a long way since the original development of the Internet. There were many flaws in keyword interpretation and analysis in the old days, but the technologies and methods have been developed to the point that keywords can be automatically detected and ranked accurately.

As the detection algorithms have become more complex, it has become more difficult for people to fool the search engines with tricky tactics. However, there are still ways to organically produce better search engine results by working with the known methods of keyword density analyzers. These are known as white hat methods; the unscrupulous methods mentioned before are, likewise, called black hat methods.

The goal of search engines has always been to return useful and relevant information to whoever is performing the search. The technology was limited in the beginning, and it was more or less a toss-up whether the results you got first were actually the most useful to you.

However, most people recognize now that the first page of results on Google will almost always contain what you are looking for. The technology has advanced over the years to keep up with growing needs and techniques.

Although search engines have always used multiple methods of ranking results, keywords have traditionally been among the most prominent. As previously mentioned, search engines of the past would simply count how many times a certain word showed up on the page and rank results accordingly. This did not work out, for the reasons already discussed: people were all too aware of how the counting worked and took advantage of it for their own gain.

Now, search engines use much more advanced techniques. They look for “hidden” or overly repetitive words and count them as a negative signal, penalizing people who try to cheat the system. They add positive value for the relative importance of a word's placement; for instance, a word in the title of the page carries a great deal more weight than a word in the footer at the very bottom. Many other intelligent algorithms now work together to ensure the highest quality results at the top.
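A hypothetical sketch of this kind of position-weighted scoring, with a crude stuffing penalty, might look like the following. The weights, the 25 percent threshold, and the function names are all invented for illustration; real engines use far more sophisticated (and secret) signals.

```python
# Hypothetical position-weighted keyword scoring with a stuffing penalty.
# Weights and threshold are made up for illustration only.
LOCATION_WEIGHTS = {"title": 5.0, "header": 3.0, "body": 1.0, "footer": 0.2}

def score_keyword(word, sections):
    """Sum position-weighted occurrences of word across page sections,
    flipping the score negative if the word dominates the page."""
    word = word.lower()
    score, total = 0.0, 0
    for location, text in sections.items():
        count = text.lower().split().count(word)
        total += count
        score += count * LOCATION_WEIGHTS.get(location, 1.0)
    words_on_page = sum(len(t.split()) for t in sections.values())
    if words_on_page and total / words_on_page > 0.25:  # crude stuffing check
        return -score
    return score

page = {
    "title": "Small Business Website Design",
    "body": "We design affordable websites for small businesses.",
    "footer": "design studio, est. 2001",
}
print(score_keyword("design", page))  # the title hit far outweighs the footer hit
```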

This new breed of search engine has also inspired a new breed of developers looking to get the most out of their keywords. They have studied the new algorithms, both by reading up on the technology and by performing their own research, and have uncovered numerous methods, both white hat and black hat, of making their web pages show up higher in search engine rankings.

For their part, search engines have actually put in effort to shape these methods. They have introduced officially supported pieces of markup that let site owners provide more information to the search engines, so that web pages can be indexed more accurately.

They have also put a lot of time into documenting both white hat and black hat methods, and specifying how they reward and punish each. Although this does allow for very dedicated and intelligent people to cheat the system for a short time, the overall effect is beneficial to the web development community.

Black hat search engine optimization methods are, in a word, unscrupulous. They are called black hat because they are deceptive and work against the intended mechanism of the search engines. Search engines are designed to give a higher ranking to higher value pages, and with modern technology this is likely to happen regardless of which optimization methods are used. Typically, a truly valuable website will make its way to the top.

By taking advantage of the system, black hat optimizers are pushing their websites ahead of more worthy and worthwhile ones. The result is that users who are trying to look for valuable content instead come across relatively worthless websites that have had expensive black hat techniques applied to them.

While such a method may be effective at bringing a page closer to the top of the search results, it will most likely not be effective at keeping people on the website. If the website had something to offer, it would be able to achieve high rankings without resorting to poor techniques.

Black hat techniques are also only effective in the short run for getting a website to the top of the rankings. Due to their nature, over time such websites will fall in the rankings and will likely never recover. If a search engine recognizes the black hat techniques being used on a website, it will penalize that website, possibly permanently. The damage could be irreparable for a small Internet business.

The principle behind why black hat techniques are bad is simple: websites should not have to be dishonest to attract visitors. A website should offer true value, something that people really want to have.

That value can then be marketed traditionally and honestly. If the website is really valuable, it should succeed through these honest methods. If it cannot succeed this way, then, in Darwinian terms, it has no right to succeed, and changes should be made so that it truly is a valuable website.

Black hat techniques simply lend the false appearance of value to a website. Even if they fool the search engines, chances are that they will not fool the visitor. People can normally tell within seconds whether or not they will be able to get what they want from a website.

Not all methods of utilizing keywords are unscrupulous. White hat techniques are recognized by web development professionals and search engines alike. Rather than lying about the content of a website, white hat optimization means organizing that content so that it markets itself effectively to both people and search engines.

This is not entirely obvious if you are unsure exactly what the search engines are looking for. The good news is that much of it happens naturally if you simply develop a website for its visitors.

One search engine measurement that has traditionally been considered valuable (though somewhat less so today) is keyword density, the idea being that search engines look to see how many times a keyword shows up on a web page. Nowadays they also take advantage of synonyms and intelligent analyzers to give more value to the pages that really deserve it. Algorithms differ from engine to engine, but they all look at the density of the keywords on the page.

Different search engines reward different values, but a common suggestion is to aim for anywhere from two to eight percent keyword density on every web page; that is, between two and eight of every 100 words should be a keyword. This can be difficult to measure by hand, especially with words that are not quite keywords or that have similar meanings, but there are free tools online that can help if you are having trouble.
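As a rough illustration, here is a minimal density check in Python, using the two-to-eight percent range suggested above; the function name and simple tokenization are invented for this sketch:

```python
# Minimal keyword density check: occurrences per 100 words of page text.
import re

def keyword_density(page_text, keyword):
    words = re.findall(r"[a-z']+", page_text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

text = "We design websites. Good design is simple, and design should load fast."
density = keyword_density(text, "design")
print(f"{density:.1f}%")       # 3 occurrences in 12 words: 25.0%
print(2.0 <= density <= 8.0)   # False: far above the suggested range
```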

A better way to gain value with keywords is by placing them in prominent locations on the web page. The URL, title, and headers are all great places to get keywords noticed by both users and search engines. This is probably something that most people already do for their human visitors, so it is not really a big concern.

One neat technique is keyword proximity. Two keywords next to each other are more valuable than when they are apart. For instance: “small business website design” is more valuable than “we do website design and work with small business.” People are more likely to enter the first phrase into search engines, which will then return results as close to the searched phrase as possible.
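To illustrate, here is a small sketch that measures the gap between two keywords; the function and its crude tokenization are hypothetical, just to show what a proximity signal might look at:

```python
# Illustrative proximity measure: the smallest word gap between two keywords.
def min_word_gap(text, word_a, word_b):
    words = [w.strip(".,").lower() for w in text.split()]
    pos_a = [i for i, w in enumerate(words) if w == word_a]
    pos_b = [i for i, w in enumerate(words) if w == word_b]
    if not pos_a or not pos_b:
        return None  # one of the keywords never appears
    return min(abs(a - b) - 1 for a in pos_a for b in pos_b)

print(min_word_gap("small business website design", "business", "design"))
# 1: only "website" sits between them
print(min_word_gap("we do website design and work with small business",
                   "business", "design"))
# 4: a much weaker proximity signal
```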

In summary, it is best to utilize white hat techniques when considering keyword density for a website. They ensure that both search engines and users will get the most value out of your website.
