The Need for a New Gauge: Moving Beyond Visual or “Toolbar” PR Rankings

Contrary to popular opinion, visual PageRank is not the best tool for evaluating the popularity of specific Web pages. Not only is it often out of date, but there are important factors that it does not take into consideration. This article discusses why so much attention is paid to visual PR, as well as alternatives to examine when trying to determine a Web page’s true value.

“Man oh man! When are we going to get a PR update?”
“Ugh! I’m tired of waiting for it. The hours are turning into days. What exactly am I waiting for, anyway?”

There are really only two viable reasons for the fervor associated with the infamous PR update:

  1. People want to improve their visual PageRank so that they can improve their bargaining position with regard to reciprocal link exchanges.

  2. People want to improve their visual PageRank so that they can improve their bargaining position with regard to selling text link advertising.



Both of these reasons are tied to the belief that visual PR is the best gauge of the potency, or overall SEO value, of a specific Web page. Visual PR has a long-standing reputation as the “be all and end all” tool for determining a Web page’s SERPs value. Webmasters have become accustomed to consulting visual PR data to judge the value of a link from a specific page, but the truth is that there are other, far more accurate ways to make that judgment.

Think of it this way: if you were doing an SEO analysis of a Web page to determine its “potency” in terms of SERPs, would you rely on PR value alone? Any established SEO webmaster will surely answer “Hell no!”
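For readers who have never looked behind the green bar, the PageRank idea itself is just a recursive popularity vote that can be computed by power iteration. The sketch below is purely illustrative (the 0.85 damping factor comes from the original PageRank paper; this is obviously nothing like Google’s production system, which layers many other signals on top):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over a link graph.

    `links` maps each page to the list of pages it links to; every link
    target must also appear as a key. Illustrative only -- real engines
    add many refinements on top of this basic recurrence.
    """
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                # Each page splits its rank evenly among the pages it links to.
                share = damping * ranks[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # A dangling page (no outbound links) spreads its rank evenly.
                for q in pages:
                    new[q] += damping * ranks[p] / n
        ranks = new
    return ranks
```

Run it on a toy graph and the ranks sum to 1, with heavily linked-to pages floating to the top; that relative ordering is the signal the toolbar compresses into a single 0–10 number.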



So why do webmasters insist on using this gauge in order to determine the worthiness of a link exchange or text link ad? I honestly have no idea, but I’ll give it a shot anyway:

  1. Familiarity – One of the first things newcomers to the SEO discipline learn about is PageRank. They install the Google toolbar, activate the little green bar, and off they go to evaluate websites. The next thing you know, they’re complaining in some SEO forum about how irresponsible Google is for not updating the green bar more often.

  2. Laziness - It’s much easier to refer to a Web page’s PR ranking instead of taking the time to evaluate factors such as backlink data (in Yahoo or MSN), outbound links, link location, SERPs of the page in question, and other important factors. People are always looking for shortcuts, and the little green PR bar provides the ultimate shortcut in terms of evaluating a web page’s SERPs value.

  3. Ignorance – Many webmasters simply do not know that, among other things, visual PR has absolutely no bearing on SERPs (Google calculates “internal” PR data for all indexed Web pages, but does not share this data with the public) and does not factor “relevance” at all.

  4. No alternative – There are no third-party tools that properly gauge the SERPs value of a specific Web page. Until someone (or preferably several different entities) develops a ranking tool that supersedes PR for evaluating SERPs value, webmasters will continue to rely on the good old green bar.

    Note: There have been some faint rumblings that Google is actually in the process of refining visual PR in order to make it more “useful,” but I’m not holding my breath.




So what’s a responsible webmaster to do? Reciprocal linking and text link advertising are not going away, so there needs to be a more sane method for evaluating Web pages. Here is my suggestion:

The SEO community as a whole must take steps to change the current mindset and move away from utilizing PR as the predominant method for establishing the SERPs value of a specific Web page. Until someone comes up with a more comprehensive third party tool, or Google decides to improve upon the current visual PR concept, here is a checklist of things to look for when evaluating a Web page:

  • Backlinks (link:http://www.yourdomain.com) – Yahoo is the best place to look. Disregard Google’s data; it purposely returns an incomplete sample when you use the link: command in its engine.

  • Link location – Is the link on the footer? Is it on the left or right rail? Is it up near the top or the center of the page? This is an important factor because Google and the other engines are beginning to award varying weight to the content on different regions of a given Web page.

  • Number of indexed pages (site:http://www.yourdomain.com) – Make sure to run this query in Google. The more pages indexed, the better.

  • Freshness of page or site as a whole – Does content get updated often or does this site have the same content that it had in 1997? The fresher the content, the better.

  • Google’s site flavor tool – If you’re not familiar with this tool, go to this URL for more info: http://www.google.com/services/siteflavored.html

  • Relevancy and traffic – This one is a “no brainer.” Does the site have solid traffic and is it relevant to your site? Use your common sense here.

  • SERPs for specific search terms – This is critical. SERPs are the true measure of a Web page’s search engine “potency.” They also give good insight into the relevancy of the Web page.
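One way to make a checklist like this repeatable is to fold the factors into a single rough score. The sketch below is purely hypothetical: the `PageMetrics` fields and every weight in it are my own assumptions, not anything any engine publishes. It simply shows how the numbers you gather by hand (Yahoo backlink counts, site: results, link placement, freshness, relevance) could be combined:

```python
import math
from dataclasses import dataclass

@dataclass
class PageMetrics:
    """Hand-gathered numbers for a candidate link page (field names are my own)."""
    backlinks: int            # e.g. Yahoo's link: count
    indexed_pages: int        # Google's site: count
    in_main_content: bool     # True if the link sits in body copy, not a footer/rail
    months_since_update: int  # rough freshness of the page
    relevant: bool            # topically related to your own site

def link_value_score(m: PageMetrics) -> float:
    """Combine the checklist factors into a rough 0-100 score.

    The weights below are illustrative guesses, not published ranking data.
    """
    score = 0.0
    # Log scale: going from 100 to 1,000 backlinks should matter far more
    # than going from 10,100 to 11,000.
    score += min(30.0, 10.0 * math.log10(m.backlinks + 1))
    score += min(20.0, 5.0 * math.log10(m.indexed_pages + 1))
    score += 15.0 if m.in_main_content else 0.0
    score += max(0.0, 15.0 - m.months_since_update)  # freshness decays monthly
    score += 20.0 if m.relevant else 0.0
    return min(score, 100.0)
```

The exact numbers are beside the point; what matters is that a stale, irrelevant footer link on a thin site scores far below a fresh, relevant, well-linked page, which is precisely the distinction a lone green bar cannot make.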

Now let’s get back to the idea of developing a third-party tool that is not affiliated with Google or any of the other engines. My buddy Rand (better known as “randfish” in the various SEO forums) let it be known that he was developing such a tool just as I was about half finished with this article. So I decided to try to land a quick interview with him so that he could discuss some of the finer details of the project. Here is a transcript of our conversation:

Question: How long have you been in the SEO industry, and what type of business do you engage in (affiliate revenue, advertising revenue, professional SEO firm, in-house SEO for an established company)?

I’ve been in Web development since 1996, but I got into SEO in 2000 (working with other companies and people) – I’ve only been around the forums (and doing the hands-on SEO work) since 2003, and I launched SEOmoz in 2004. Our business model is to run every part of the Web business for several companies (in different sectors – retail, finance, software), so we manage design, development, promotion, SEO, usability, the works.

Question: Why do you think that the SEO community in general turns to visual or “toolbar” PR when it comes to gauging the value of a link from a specific Web page? Also, in your opinion, why does visual PR fall short in this regard?

PR was, at one time, a reasonably good way to measure the value of a link in terms of how Google ranked sites. PageRank by itself virtually dictated the order of results for several years after Google’s launch, until they became a more sophisticated search engine. Right now, link buyers probably do this out of both habit and laziness. It’s a very easy way to get a rough idea that the site isn’t banned by Google and has a relative degree of link popularity. PR is still good for that last purpose – it can give a very rough, relative estimate of the link popularity for a given site/page. What’s unfortunate is that so many link buyers (and probably sellers too) don’t realize that this is not what governs the value of links anymore. A highly linked-to site that also links to you no longer provides the boost it did in 2001 or even 2003. Today, link analysis sophistication has been taken to a far greater level, and that’s where toolbar PR falls short.

Question: What was the last straw that convinced you to undertake this project of building a third-party tool that could gauge the value of a specific Web page?

I believe there was actually a thread at SEOChat forums that convinced me. There was a lot of discussion on what to measure, etc. and I thought that a tool that automatically pulled as many of those factors as possible would be useful. The tools I make aren’t the “end all, be all;” they’re just designed to pull information automatically to save time and effort for SEOs. Many times, especially on days when the tools are popular or linked to by a big SEO site, the wait can be 20 or even 30 minutes, but it’s still time when you can work while it sits in the background and runs (plus it’s free).

Question: How long have you been working on this project?

I’ve been working on SEOmoz since the fall of 2004, including the tools. The link pricing tool specifically has been in development for just under three weeks, and I’ve only recently put together the calculations that will be used to grade the factors (which involves lots of nasty equations to get a scale that feels right).

Question: Can you give me a little insight into some of the features that this tool will possess? Why are webmasters going to be compelled to use it?

It’s going to be scraping a lot of information that you would normally attempt to pull yourself when deciding whether to buy a link. As I said before, the information it returns isn’t completely authoritative, but it can help to give you an idea of what to do. The tool can’t take into account how many visitors the page gets or how many will click on your link, so it’s really just designed to estimate the value in terms of link popularity at the search engines.

Probably the most compelling reason to use it will be to get a rough idea of the range a link is in – like the keyword difficulty tool, which I use all the time to gauge the range of effort required to rank, this will provide a similar rough estimate of what to pay.
