Putting it all together
Let’s say you have a recipe site. You offer free recipes, but you also sell paid recipe subscriptions, along with recipe cards and books your customers can order. Your target audience is women, families, moms, dads, grandmas, and newlyweds. Your web stats show a visitor who came from Recipes.com and landed on a chocolate chip cookie recipe. You determine from your web stats that this page was viewed between the visitor’s local time of 6:15 pm and 6:30 pm. Your visitor viewed the recipe for about 2 minutes, clicked a few links within your site for a couple of minutes, then browsed your quick meals category for about 5 minutes.
However, you also determine that this visitor left the category after viewing only 2 other recipes without downloading any, having lingered on the second longer than the first. Your visitor did bookmark your site for later, though. So what else can you probably determine about this visitor? Well, your visitor is probably a woman seeking to make her family dinner, but she needs something quickly. She probably has kids home from school and a husband home from work soon, and she is interested in the cookie recipe, but not right now. She’s in a hurry, looking for something in particular? Perhaps.
What can you determine about how this user’s visit could have been more effective? You probably need to look at your site navigation. Did she have trouble finding the category she wanted immediately? Were there too many steps to take to get there? And once she was there, did she have a search capability? You know she’s interested in viewing your site more, because your site is bookmarked for later. But why didn’t she download a recipe at that time? You could look at the two pages she checked out. Were they easy to understand? Were there popup ads that drove her away? Maybe she wished to download her recipe without giving her personal information.
It’s difficult to tell these things from a single visitor. However, comparing this visitor’s behavior to the browsing habits of your other visitors will give you a better understanding of the trends among visitors to your site. That will show you the areas where your web site needs improvement.
ROI, or Return on Investment, is essential for tracking advertising costs. You need to know which advertising avenues are paying off, and which are simply a waste of money. Determining whether the links from those ads are bringing in buying visitors, or just surfers, will show you how effective those ads really are.
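The ROI calculation itself is simple. A minimal sketch, using made-up figures for ad spend and tracked revenue:

```python
# Hypothetical figures; plug in your own ad spend and the revenue
# your stats attribute to visitors arriving from that ad.
def roi(revenue, cost):
    """Return on investment, as a fraction of the amount spent."""
    return (revenue - cost) / cost

# An ad campaign that cost $200 and brought in $500 of tracked sales:
print(roi(500, 200))  # 1.5, i.e. a 150% return
```

A negative result means the ad cost more than the sales it produced, which is exactly the kind of venue you want to cut.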
Conversion ratios are important to monitor if you are involved in e-commerce. A conversion ratio is the number of sales divided by the total number of visitors to your site. I will touch on conversion ratios more in my next article; however, I do want to point out one thing in particular. You have to have a good understanding of your web stats for your tracking efforts to be effective. Without that understanding, you are simply wasting your valuable time. Not only that, but you have to use that understanding to build the big picture. Web stats are like a puzzle: each piece gives only part of the picture, and putting them all together reveals the whole. Tracking your progress in charts or graphs is also a good way to follow the browsing and buying habits of your visitors.
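The definition above translates directly into a one-line calculation; the numbers here are invented for illustration:

```python
def conversion_ratio(sales, visitors):
    """Conversion ratio: completed sales divided by total visitors."""
    return sales / visitors

# Hypothetical month: 12 sales out of 400 visitors.
print(conversion_ratio(12, 400))  # 0.03, i.e. a 3% conversion rate
```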
When you combine your gathered results in the way you need to, there are endless possibilities of calculations you can make. There is no limit to what you can learn and track.
Using the data you collect to track your success or lack thereof means you have a better chance of improving upon what is working for you, and correcting what is not. I mentioned “stickiness” earlier, and you can find out how “sticky” your pages are. There can be many things that affect stickiness and how long a visitor stays in your site, such as ease of use, navigation, lack of broken images or links, image load times, and even graphic design.
Your web stats can also show you how attractive your site is to search engines. Search engines are visitors too. Many web stats programs track the visits of search engines separately from those of human visitors. Others lump all visitors together, including search engines. If that is the case with your stats program, you can still determine which visitors are search engines and which are not.
Many of the search engines can give you the IP addresses of their robots and spiders; simply browse the search engine’s help pages for that information. Comparing those IP addresses against the data your web stats program collected can help you separate out the search engine spiders. Also, search engine spiders and robots tend to fly rapidly through the links and pages on your website in a fashion that humans do not. If you see 30 page views from a single visitor in one minute, the chances that it is a spider or robot are very high. The navigation paths will also give you clues as to which type of visitor you have. Many search engines will stop at secure pages like login, checkout, or account creation, then move on to other pages.
Search engine spiders and robots will crawl your website, amassing many pages. If your average is 5 or 6 page views per visitor, and one visitor is browsing hundreds of pages at a time, you can be relatively sure that visitor is a search engine. A dead ringer for a search engine spider is a request for the robots.txt file. Since reputable spiders request this file on every visit, it also helps you determine the frequency of their visits.
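The heuristics above can be sketched in a few lines. This is a simplified illustration, not a real log parser: the log records, IP addresses, and the 30-pages-per-minute threshold are all assumptions taken from the discussion above.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical simplified log records: (ip, timestamp, path).
# In practice you would parse these out of your server's access log.
log = [
    ("66.249.66.1", "2005-03-01 23:00:01", "/robots.txt"),
    ("66.249.66.1", "2005-03-01 23:00:02", "/recipes/cookies.html"),
    ("66.249.66.1", "2005-03-01 23:00:03", "/recipes/quick-meals.html"),
    ("10.0.0.5",    "2005-03-01 18:15:00", "/recipes/cookies.html"),
    ("10.0.0.5",    "2005-03-01 18:17:30", "/recipes/quick-meals.html"),
]

PAGES_PER_MINUTE_LIMIT = 30  # assumed threshold from the text

def looks_like_spider(hits):
    """Flag a visitor as a spider via robots.txt requests or page-view rate."""
    times = [datetime.strptime(t, "%Y-%m-%d %H:%M:%S") for _, t, _ in hits]
    paths = [p for _, _, p in hits]
    if "/robots.txt" in paths:          # reputable spiders fetch robots.txt
        return True
    span = (max(times) - min(times)).total_seconds() or 1
    rate = len(hits) / (span / 60)      # page views per minute
    return rate > PAGES_PER_MINUTE_LIMIT

by_ip = defaultdict(list)
for hit in log:
    by_ip[hit[0]].append(hit)

for ip, hits in by_ip.items():
    print(ip, "spider" if looks_like_spider(hits) else "human")
```

Checking visitor IPs against the published spider addresses would be a third test you could add to the same function.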
Now that you’ve determined which visitors are search engines, you need to look at, compare, and chart the data. A few questions about your human visitors should also come up during this charting process. Are visitors coming into your site from search engines? Which pages are those visitors landing on? Those are the pages that have been indexed by the search engines, and therefore how those visitors found you. You should also determine which keywords your visitors used to find your site, and compare those to the keywords you are tweaking on each of your pages. You can also determine which landing pages are the most requested from the search engines. You can be reasonably sure at this point that those top requested pages are ranking well. You can also determine which pages are not ranking well, and why.
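Those keywords can be pulled out of your referring URLs, since search engines pass the visitor’s search phrase in the query string. A small sketch, with invented referrer URLs; the query parameter names checked here are common ones, but each engine names its own:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical referring URLs taken from a stats report.
referrers = [
    "http://www.google.com/search?q=quick+cookie+recipe",
    "http://search.yahoo.com/search?p=chocolate+chip+cookies",
    "http://www.recipes.com/desserts.html",  # a plain link, not a search
]

QUERY_PARAMS = ("q", "p", "query")  # assumed parameter names

def search_keywords(url):
    """Return the search phrase if the referrer looks like a search results page."""
    qs = parse_qs(urlparse(url).query)
    for name in QUERY_PARAMS:
        if name in qs:
            return qs[name][0]
    return None

for url in referrers:
    print(url, "->", search_keywords(url))
```

Tallying the phrases this returns, then comparing the counts to the keywords you are targeting on each page, tells you whether your tweaking is paying off.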
You should also look for search engine spiders using sessions. Search engine spiders do not accept cookies (small files stored on a user’s hard drive), so sessions are used for visitors who cannot or won’t allow cookies. If spiders are not prevented from using sessions, your pages could be indexed with those session ids attached to the end of your URLs. For example, if your page to be indexed is:

http://www.example.com/recipes.html

and a session id is attached to it, making it look like:

http://www.example.com/recipes.html?sessionid=89d72f6b

then the URL is indexed with the session attached, and this will cause problems. When a spider that has indexed a page with a session id follows the link stored in its database, it will encounter a 404 Page Not Found error, because it cannot recreate that session. Your web stats will alert you both to the pages the spider crawled and to the errors it encountered.
The solution to this problem is to install a script that prevents spiders from creating session ids. You may need to consult your shopping cart’s user manual for more information.
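The core of such a script is just a User-Agent check made before a session id is handed out. This is a minimal sketch of the idea, not your cart’s actual implementation; the signature list is illustrative and far from exhaustive:

```python
# Illustrative spider signatures; real lists are longer and change over time.
SPIDER_SIGNATURES = ("googlebot", "slurp", "msnbot", "crawler", "spider")

def should_create_session(user_agent):
    """Only hand out a session id to visitors that don't look like spiders."""
    ua = (user_agent or "").lower()
    return not any(sig in ua for sig in SPIDER_SIGNATURES)

print(should_create_session("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # False
print(should_create_session("Mozilla/5.0 (Windows NT 5.1)"))             # True
```

Spiders that pass this check still get normal pages; they simply never receive a session id to drag into the index.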
Even knowing the approximate times of day that search engine spiders crawl your site can be helpful. For example, Google usually crawls our website around 11 pm central time. What I take from this statistic is not to upload new files or scripts during that window, when the spider might have trouble crawling them; instead, do it before the potential crawl time, or wait until after. There is no way to know exactly when a spider will crawl your site, but analyzing your web stats can help you be more prepared and help you understand the crawling patterns of each search engine.
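Finding that usual crawl hour is a matter of bucketing the spider’s request timestamps by hour of day. A short sketch, using made-up Googlebot timestamps:

```python
from collections import Counter
from datetime import datetime

# Hypothetical timestamps of one spider's requests, from a stats report.
crawl_times = [
    "2005-03-01 23:02:11",
    "2005-03-01 23:05:40",
    "2005-03-02 23:01:03",
    "2005-03-03 22:58:12",
]

hours = Counter(datetime.strptime(t, "%Y-%m-%d %H:%M:%S").hour
                for t in crawl_times)

# The most common hour suggests the window to avoid for uploads.
print(hours.most_common(1))  # [(23, 3)]
```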
By now, you should understand your web stats, what they mean, how to put them together to give you even more detailed information, and how to read them with regard to search engines and search engine keyword results. Here are a few tips I would suggest to help you in your endeavors:
- Make sure your stats program covers all the basics. You will be left out in the cold if the main concepts of web statistics are not covered in your program. The necessities of web stats would be: number of new or unique visitors, number of page views, referring URL, error reporting, entry and exit pages, top requests, and visitor navigation paths.
- Make sure you have the ability to determine search engine behaviors. If your web stats program doesn’t track search engines separately from humans, then you may want to consider switching stats programs. You will save yourself valuable time without having to make all the calculations to determine which activities were created by search engine spiders.
- Check your stats frequently. It only takes a single broken link to affect your traffic or sales, so make sure you are aware of what is happening with your site several times a week.
- Check your stats regularly. By having scheduled, regular times to monitor your stats, you are better able to compare apples to apples, rather than apples to oranges. If you check your stats every Saturday at 8 pm and compare them to the previous Saturday at 8 pm, you can chart your progress over time. Be aware that daily stats will vary, so it’s important to spread your results over a regular period, like a month, or even several months.
- Know where your inbound links are. This way, you can better monitor the usefulness of those links when comparing results from your referring URLs.
- Keep spiders from creating session ids, or create 301 redirects from sessions in your .htaccess file.
- Actually use your results. It doesn’t do you any good to view, compare, and chart your results if you don’t intend to use the information to update your site. Information is only as powerful as the person who puts it to work.
- Be willing to make changes. Along the same lines as using your results, you have to be willing to trust your results, and make your alterations accordingly.
- Use your stats to whittle out unqualified traffic. It’s difficult for a webmaster to give up traffic, but trading volume for targeted traffic will allow you to make better comparisons of your web stats to find out who is really buying and why, as well as why others are not buying. This means getting rid of ineffective landing pages and removing links that don’t bring you qualified traffic. You may find that removing irrelevant links gives you a better position in the search engine results pages, too.
- Allow your marketing or SEO consultant to have access to your stats. A professional may be able to spot trends in your statistics that escape the untrained eye. They will also be able to give you a new perspective on what is effective marketing for your site, and what needs to be changed.