
 Top 21 SEO Tips for 2007

by GarytheSubaGuy

http://www.devhardware.com/forums/search-engines-marketing-86/top-21-seo-tips-for-2007t-153862.html

Notes: needs clarification as it doesn’t seem to be part of the top 21 series. Can be run with a different title except for the top 50 tips section. We already ran some articles using those tips. We could probably remove that section and link to those articles instead; this article is long enough that it will probably *still* be long enough to get a full article out of it after that.



Link building is the single most important element in obtaining high rankings in all of the major search engines. It is vital that continual efforts be made, and long-term plans laid out, to ensure a web site's continued success in organic search results and reduced costs in paid placement (PPC).

Google created the most successful information retrieval device of all time based on sending spiders to follow each and every link they can find on each and every web document they come across. Yahoo, MSN, Ask, and all the other search databases have acquired the vast amounts of information they contain in similar fashion. Links play important roles in the ranking formulas of all search engines, especially Google, by providing numerous pieces of data for their algorithms to chew through.

The best links a web site can have are natural, one-way inbound links. These are links that are posted by other web sites, forums or blogs. These show a natural interest in something the linked web site offers such as valuable information, news, a tool or some other resource.

The more one-way links a web site has, the more reliable the search engine algorithms consider it to be. Google goes as far as to rank a web site in terms of PR, or PageRank, on a sliding scale of 0-10. The more important Google considers a web site, the higher the PR it awards. (PR also takes visitors into account.)

You can check the number of back links to a web site in many different ways. The Firefox browser has an installable extension that allows users to “right-click” and scan down to “back links” to see the number of back links a site has. There are several toolbars that you can install (Google, Yahoo, etc.) that allow you to see this, and there are various web sites that offer tools to do this.

(Google is unique in its approach to back links in that it will only show a percentage of the actual back links, whilst Yahoo and MSN show all of them. Google will also delay showing back links in an attempt to weed out purchased back links and other schemes to effectively fool the algorithms into awarding a higher PR, and thus a higher position in the SERPs – search engine results pages.)

These will check the number of back links that a page has: http://www.iwebtool.com/backlink_checker

http://www.searchenginegenie.com/backlink-checker.html

This will check the number of back links that the top 10 sites have, based on your selected keyword (this will help you find relevant sites):
http://www.webuildpages.com/seo-tools/whoischeck-bykeys.pl

Or you can even use the free version of IBP – Arelis to do this. Arelis will:

  1. Search for Link Partners by Keyword

  2. Search for Link Partners by Finding Who Already Links Back to You

  3. Find Out Who Links to Your Competition
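
If you would rather script a rough count than lean on these tools, here is a minimal sketch that fetches Google's results page for a link: query and scrapes the reported total. Treat it strictly as a sketch: the result-count markup changes over time, automated querying may sit badly with an engine's terms of service, and (as noted above) Google only shows a sample of back links anyway. The domain is a placeholder.

# backlink_count.py - rough sketch: ask Google for link:<domain> and
# scrape the reported result count. The markup this regex expects is
# not stable, so treat any number it returns as an estimate at best.
import re
import urllib.parse
import urllib.request

def backlink_count(domain):
    query = urllib.parse.quote("link:" + domain)
    url = "http://www.google.com/search?q=" + query
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req, timeout=15).read().decode(
        "utf-8", errors="replace")
    match = re.search(r"of about ([\d,]+)", html)  # phrasing may vary
    return int(match.group(1).replace(",", "")) if match else None

if __name__ == "__main__":
    print(backlink_count("www.example.com"))  # hypothetical domain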

Types of Link Strategies

Natural Link Building – Adding quality content or something that benefits the end user that they would want to link to

One-Way Linking (Purchase) – Buying one-way inbound links to your web site

Reciprocal Linking – Exchanging links with another web site

Link Farms – Companies like linkmarket.net (but not directories, FFAs, or obvious abusers of linking)

Three-Way Linking – Site A links to Site B, Site B links to Site C, and Site C links to Site A (www.three-way-links.com/)

Forums and Blogs – Links from forums and blogs

News Articles (PR Web) – Typically created by web site owners to promote their sites. These become effective after 2-4 weeks, once Google has crawled and indexed them within its search results. Never put more than one link to any one page per article.

One of the services mentioned above, linkmarket.net, is a good tool, and it has spawned many other linking tools that do similar things.

Here's how it works: you search through their categories for relevant ones. Once you drill down to a category and click on it, a list of other members comes up along with their Google PR. You add their link to your web site and send them a request, which also provides a link for them to insert into their web site. The downfall is that you need to check that the link remains there – or that it is even placed in the first place. This is where the work begins.

You need to track all of the links to verify they aren't taken down. There are tools (Web CEO, for one) that will do this for you, but you will still need to record the link page URL so that you can enter it into the tool for the check.
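
If you would rather script this check yourself, here is a minimal sketch. It assumes a plain-text file, partners.txt, with one link-page URL per line (the filename and example domain are placeholders), and simply verifies that your domain still appears in each page's HTML.

# link_monitor.py - minimal sketch for verifying that link partners
# still link back to you. partners.txt holds one link-page URL per
# line; MY_DOMAIN is a placeholder for your own domain.
import urllib.request

MY_DOMAIN = "www.example.com"  # placeholder - replace with your domain

def page_links_back(url, domain=MY_DOMAIN):
    # Fetch the partner page and check whether our domain appears in it.
    try:
        with urllib.request.urlopen(url, timeout=15) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except Exception as exc:
        print("ERROR %s (%s)" % (url, exc))
        return False
    return domain in html

if __name__ == "__main__":
    with open("partners.txt") as f:
        for line in f:
            url = line.strip()
            if url:
                print("OK " if page_links_back(url) else "GONE", url)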

There are many ways to gain back links to a web site. You can offer valuable information that an end user finds useful, such as a map to (or of) a destination, a tool such as a mortgage calculator, or even a coupon or shopping tips. This is the way the search engines want your back links to occur, as this is the natural link building process: an end user finds something on a web site that they feel is useful, and they create a link to it.

Another method is purchasing one-way links. You must be very careful when attempting this strategy, as many things can go awry, and the search engines (especially Google) are looking very hard at how to avoid rewarding web sites whose link building efforts exist only to obtain a more favourable position in their results.

Whilst Google PageRank doesn't directly affect your SERPs, the back links from trusted sources do. The way this works is that Google looks at the PR of the referring web site and passes on PR. The influence of this "bleeding" effect is determined by:

  • The PR of the referring site

  • The number of outbound links on the page containing your back link

  • The "trust" rating of the referring web site, according to Google, which is based on the registration date and consistent content, as well as the web site's own back links and these same parameters

Put in basic terms, this means a back link from a site that has no PR passes on next to nothing, so it is not worth spending much time obtaining one.

Here is an example of Google's "weightedness" (a word made up by Gary):

Site 1 with a PR5 has 50 links (the max you want on 1 page) = bleeds .0012 PR

Site 2 with a PR5 has 10 links = bleeds .430 PR

Site 3 with a PR5 has 2 links = bleeds .776 PR

Additionally, Google seemingly weights back links from .orgs slightly higher, and back links from .edus and .govs significantly higher. This opens many vertical possibilities when taken into consideration whilst planning your long-term back link strategy. Ask me about these if you're willing to do a lot of hard work.

The following is the same example as above, but based on a back link from an .org, .edu or .gov:

.org/.edu Site 1 with a PR5 has 50 links (the max you want on 1 page) = bleeds .4352 PR

.org/.edu Site 2 with a PR5 has 10 links = bleeds .88721 PR

.org/.edu Site 3 with a PR5 has 2 links = bleeds 1.176 PR

So this means that it is important to get back links from high PR sites, as well as sites that have related content.
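
Gary's bleed figures above are illustrative rather than computed. For reference, the formula from the original PageRank paper is PR(A) = (1 - d) + d × (PR(T1)/C(T1) + … + PR(Tn)/C(Tn)), where d is a damping factor (usually 0.85) and C(T) is the count of outbound links on page T. The sketch below applies just the per-link contribution, d × PR(T)/C(T), to the three sites in the example; toolbar PR is on a roughly logarithmic scale, so the absolute numbers will not match Gary's, but the relative effect of the outbound-link count is the same.

# pr_contribution.py - per-link PageRank contribution from the original
# Brin/Page formula: each outbound link on page T passes d * PR(T)/C(T).
# Only the relative sizes matter here, not the absolute values.
D = 0.85  # conventional damping factor

def passed_pr(referring_pr, outbound_links, d=D):
    return d * referring_pr / outbound_links

for links in (50, 10, 2):
    print("PR5 page with %2d links passes %.4f" % (links, passed_pr(5, links)))
# -> 0.0850, 0.4250 and 2.1250 respectively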

The Link

Just as important as the number of back links is the actual content of each back link.

The search engines are experiencing an overwhelming problem with spammers and black-hatters overtaking their results, skewing the quality of their primary intended function: search, and providing relevant results. Because of this, each of the main three search engines has introduced, or will soon introduce, an entirely new algorithm meant to eliminate the bad and provide the genuine, relevant results that the end user is looking for.

So Google tweaked its algorithm to place increased weight not only on back links, but on the actual content that surrounds the back link. So be sure that when you spend all this time on link building, you do it the right way. Put your link inside relevant text, with other keyword strings that contain your primary keyword set.

What this means is that if I were optimising a web site and one of its keyword phrases was "debt consolidation", I would create a back link that used "debt consolidation" (actually I would use "Get Debt Consolidation", because you need a "grey" word before your keywords in ANY circumstance when doing optimisation, to avoid obvious SEO red flags), and the link description would also include that phrase. A good example of this is:

Expert Debt Consolidation – Get Cheap Debt Consolidation Now.

This is a basic example. Every web site and back link offer/tool will have different parameters stating how many characters you can use, the length, the content, the number of capitals, and the number of superlatives of the "best", "cheapest" or "lowest" type. The point I am making here is that you need to take full advantage of the link. You do this with carefully selected anchor text and descriptions. These links need to be carefully created and linked back to SE-optimised landing pages that mirror your anchor text and description. These elements are EXACTLY what ALL search engines, especially Google, use to weight or grade the link.
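
In markup, the example above might look something like this (the URL and title text are hypothetical): the anchor text carries the keyword phrase, the title attribute reinforces it, and the href points at a landing page optimised for the same phrase.

<a href="http://www.example.com/debt-consolidation.html"
   title="Get cheap debt consolidation quotes">Expert Debt Consolidation – Get Cheap Debt Consolidation Now</a>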

This, coupled with quality content, the correct keyword density, and other SEO elements, is core to obtaining high rankings with all SEs organically, and to PPC at a cost well under what the competition is paying.

Link Tools

Alexa (part of IBP 9.2), WebCEO and many other available tools work efficiently, and can be very effective if utilised in the correct fashion. These tools take your selected keyword and, based on the parameters that you set up, crawl the search engines and the top-ranking web sites that come up for that particular query. They then pull any available email addresses from each site; if there isn't one available, they default to whatever you select (i.e. webmaster@ or info@).

So let's say you are searching for back links from sites related to women's undergarments for Bravissimo. I would enter "women's clothing" into the search box, and these tools would come back with the number of sites that you request. The tools give you the number of back links a site already has, its PR strength, a relevancy grade and so on.

These tools have other optional settings to help in your link building schemes.

  • Find web sites to link to by keyword

  • Find web sites that link to your competitors

  • Find web sites that already link to you (to possibly change the anchor text or add additional deep links)

IBP, for instance, will scrape the results of these searches and scan each web site for email addresses. This is a big timesaver.

Investigate the many tools available to find the one that suits your needs the best. Stay away from the cookie cutter approach if possible as link building has been going on for years and most web site owners have received thousands of “canned” requests over the years.



Things to avoid when link building

  • Stay away from link farms (http://www.jimprice.com/jim-lnk.htm#people)

  • The site has no possible connection to your subject matter whatsoever. The page they put your link on isn’t linked to FROM any page, meaning it’s floating out there in never-never land and is a ploy to get you to link to their site.

  • The page where they put your link is on a URL a mile long and several directories deep so engines will never find it.

  • The page looks like a farmer’s field with nicely arranged rows of links to hundreds of sites which aren’t necessarily organized in any logical manner, but that doesn’t matter because someone told them the link is all that counts.

  • It’s a link and a link only. No description. No proof the person ever actually reviewed the site.

  • Signs they’ll accept anything that shows evidence of being a "live" link. A true Directory has criteria, frets about the quality of sites it links to and doesn’t have people out begging for links. Instead the reverse is true, with people begging to be let in.

  • Watch for scams such as sub-domain one-way traffic feeders, where the page your site is linked to isn't part of the main web site. Study the URLs carefully before you decide to accept a link request.

  • Stay away from FFA sites (Free For All)

  • Avoid being on a web site that has pages and pages of links. This is viewed as a Link Farm.

  • Stay away from sex oriented, gambling, RX and other unsavoury sites.

  • Be aware of the possibility of bad neighbours. If you are on a shared server, do a blacklist check to be sure you’re not on a proxy server with a spammer or banned site. (This is a tool available in Tip #20)

  • Don't waste your time getting a link from a non-ranking page within a site. The page needs to hold a PR value no more than 1 below your landing page's, particularly if there are going to be other outbound links to other web sites. If there are going to be no other outbound links, or just a few, then a PR of 2 and above will still boost your ranking and benefit your SERPs as well as your own PR.

  • Stay away from link pages called “Link Partners”, “Links” or the like, especially if the term “link” or “links” is part of the URL

  • Stay away from pages that have more than 50 outbound links

If you are looking to build long-term rankings, it takes more work and creativity than just sending out automated emails or joining a linking program. Create a daily “hit list” outlining exactly what you will do.

Don't be afraid to pick up the phone. This is the best way to get and keep a link. You can usually find contact information at Network Solutions or on the web site's "About Us" page.

Lastly, keep at it! Link building is a marathon, not a sprint. You've been given what is probably the most important job influencing search engine results. The work you do today will put a web site at the top of the rankings tomorrow, and keep it there.

50 Ways To Get Links

1. Build a "101 list". These get Dugg all the time, and often become "authority documents". People can't resist linking to these (hint, hint). Like mine at http://www.ppc-manager.blogspot.com – I did PPC 101 and PPC 102 lists.
2. Create 10 easy tips to help you [insert topic here] articles. Again, these are exceptionally easy to link to.
3. Create extensive resource lists for a specific topic (see Mr Ploppy for inspiration).
4. Create a list of the top 10 myths for a specific category.
5. Create a list of gurus/experts. If you impress the people listed well enough, or find a way to make your project look somewhat official, the gurus may end up linking to your site or saying thanks. (Sometimes flattery is the easiest way to strike up a good relationship with an “authority”.)


Developing Authority & Being Easy to Link To
6. Make your content easy to understand so many people can understand and spread your message. (It’s an accessibility thing.)
7. Put some effort in to minimize grammatical or spelling errors, especially if you need authoritative people like librarians to link to your site.
8. Have an easily accessible privacy policy and about section so your site seems more trustworthy. Including a picture of yourself may also help build your authority.
PPC as a Link Building Tool
9. Buy relevant traffic with a pay per click campaign. Relevant traffic will get your site more visitors and brand exposure. When people come to your site, regardless of the channel in which they found it, there is a possibility that they will link to you.



News & Syndication
10. Syndicate an article at EzineArticles, GoArticles, iSnare, etc. The great thing about good article sites is that their article pages actually rank highly and send highly qualified traffic.
11. Submit an article to an industry news site. Have an SEO site? Write an article and submit it to WebProNews. Have a site about BLANK? Submit to BLANKinformationalsite.com.
12. Syndicate a press release. Take the time to make it GOOD (compelling, newsworthy). Email it to some handpicked journalists and bloggers. Personalize the email message. For good measure, submit it to PRWeb, PRLeap, etc.
13. Track who picks up your articles or press releases. Offer them exclusive news or content.
14. Trade articles with other webmasters.
15. Email a few friends when you have important relevant news asking them for their feedback and/or if they would mind referencing it if they find your information useful.
16. Write about, and link to, companies with “in the news” pages. They link back to stories and blog posts which cover their developments. This is obviously easiest if you have a news section or blog. Do a Google search for [your industry + “in the news”].
17. Perform surveys and studies that make people feel important. If you can make other people feel important, they will help do your marketing for you for free. Salary.com did a study on how underpaid mothers were, and they got many high quality links.

Directories, Meme Trackers & Social Bookmarking
18. This tip is an oldie but goodie: submit your site to DMOZ and other directories that allow free submissions.
19. Submit your site to paid directories. Another oldie. Just remember that quality matters.
20. Create your own topical directory about your field of interest. Obviously link to your own site, deep linking to important content where possible. Of course, if you make it into a truly useful resource, it will attract links on its own.
21. Tag related sites on sites like Del.icio.us. If people find the sites you tag to be interesting, emotionally engaging, or timely, they may follow the trail back to your site.
22. If you create something that is of great quality, make sure you ask a few friends to tag it for you. If your site gets on the front page of Digg or on the Del.icio.us popular list, hundreds more bloggers will see your site, and potentially link to it.
23. Look at meme trackers to see what ideas are spreading. If you write about popular and spreading ideas with plenty of original content, (and link to some of the original resources), your site may get listed as a source on the meme tracker site.

Local & Business Links
24. Join the Better Business Bureau.
25. Get a link from your local chamber of commerce.
26. Submit your link to relevant city and state governmental resources. (easier in some countries than in others.)
27. List your site at the local library’s Web site.
28. See if your manufacturers or retailers or other business partners might be willing to link to your site.
29. Develop business relationships with non-competing businesses in the same field. Leverage these relationships online and off, by recommending each other via links and distributing each other’s business cards.
30. Launch an affiliate program. Most of the links you pick up will not have SEO value, but the added exposure will almost always lead to additional “normal” links.

Easy Free Links
31. Depending on your category and offer, you will find Craigslist to be a cheap or free classified service.
32. It is pretty easy to ask or answer questions on Yahoo! Answers and provide links to relevant resources.
33. It is pretty easy to ask or answer questions on Google Groups and provide links to relevant resources.
34. If you run a fairly reputable company, create a page about it on Wikipedia or in topic-specific wikis. If it is hard to list your site directly, try to add links to other pages that link to your site.
35. It takes about 15 minutes to set up a topical Squidoo page, which you can use to look like an industry expert. Link to expert documents and popular useful tools in your fields, and also create a link back to your site.
36. Submit a story to Digg that links to an article on your site. You can also submit other content and have some of its link authority flow back to your profile page.
37. If you publish an RSS feed and your content is useful and regularly updated, some people will syndicate your RSS content (and some of those will provide links… unfortunately, some will not).
38. Most forums allow members to leave signature links or personal profile links. If you make quality contributions, some people will follow these links and potentially read your site, link to it, and/or buy your products.

Have a Big Heart for Reviews
39. Most brands are not well established online, so if your site has much authority, your review related content often ranks well.
40. Review relevant products on Amazon.com. We have seen this draw in direct customer enquiries and secondary links.
41. Create product lists on Amazon.com that review top products and also mention your background (LINK!).
42. Review related sites on Alexa to draw in related traffic streams.
43. Review products and services on shopping search engines like ePinions to help build your authority.
44. If you buy a product or service you really like and are good at leaving testimonials, many of those turn into links. Two testimonial writing tips — make them believable, and be specific where possible.

Blogs & the Blogosphere
45. Start a blog. Not just for the sake of having one. Post regularly and post great content. Good execution is what gets the links.
46. Link to other blogs from your blog. Outbound links are one of the cheapest forms of marketing available. Many bloggers also track who is linking to them or where their traffic comes from, so linking to them is an easy way to get noticed by some of them.
47. Comment on other blogs. Most of these comments will not provide much direct search engine value, but if your comments are useful, insightful, and relevant they can drive direct traffic. They also help make the other bloggers become aware of you, and they may start reading your blog and/or linking to it.
48. Technorati tag pages rank well in Yahoo! and MSN, and to a lesser extent in Google. Even if your blog is fairly new, you can have your posts featured on the Technorati tag pages by tagging your posts with relevant tags.
49. If you create a blog, make sure you list it in a few of the best blog directories.
50. Start all over again.

Finding/Identifying "Buzz" Words (like Dove's Pentapeptides) and How to Dominate Search and Turn New Words into Huge Traffic Sources
What are pentapeptides?
http://www.youtube.com/watch?v=nEyJK3JVB50
I've seen this commercial no less than 20-30 times in just the last couple of weeks. I'd like to think I'm fairly intelligent, at least to the point that I would have heard this word before, but I haven't. A search on Wikipedia turns up absolutely nothing. My brand new version of Microsoft Office (Word) 2007 doesn't have it in its dictionary either. Google shows only 135,000 results. Of the results that Google is showing, the top three have either a 2 or 3/10 PageRank, and only a few have back links. The #1 result for pentapeptides has a 2/10 PageRank and only two Google back links showing.

The point I’m trying to make here is that new words are invented every day, whether by scientists naming a drug, a car company naming a new model, or a company creating a new product. If you put the tools in place to monitor for these you can use them to corner a new market.

My first thought here, if I were an online marketer, would be to find a product that I could remarket for a commission, such as through an affiliate site. I could create a page within my affiliate web site for pentapeptides. The home page holds a PageRank of 5/10, so that page would soon hold a 4/10 and immediately be on the front page; eventually, with a little social bookmarking and link building, it would dominate the SERPs and be cemented in the top positions. Imagine what would happen if pentapeptides takes off!

Better yet, if I had a current site or a page that was ranking and had at least one back link showing in a "link:yoursite.com" search on Google, I would go in and integrate "pentapeptide" into the content according to the checklist in Tip #21.

If you're one of the lucky ones reading this first, this is a real-life example: you can actually go out, implement what I've said, and make it happen!

I use Google Alerts to find potentially new ‘niche’ phrases related to one of the sectors that I market in. It is fairly vast considering we cover financials, insurance, casino, bingo, travel, airlines, mobile phones, cars, furniture and bedding, clothing and many more.
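
A minimal way to automate that watch is sketched below, assuming you have an RSS feed of new items to scan (Google Alerts can deliver results as a feed; the URL here is a placeholder). It keeps a local list of words already seen and prints anything unfamiliar.

# buzzword_watch.py - sketch: scan an RSS feed for words not yet in a
# local "seen" list, as a crude new-buzzword detector. FEED_URL is a
# placeholder; point it at e.g. an alerts feed for your sector.
import re
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://www.example.com/alerts.rss"  # hypothetical feed
SEEN_FILE = "seen_words.txt"

def feed_titles(url):
    xml = urllib.request.urlopen(url, timeout=15).read()
    return [t.text or "" for t in ET.fromstring(xml).iter("title")]

def main():
    try:
        seen = set(open(SEEN_FILE).read().split())
    except FileNotFoundError:
        seen = set()
    words = set()
    for title in feed_titles(FEED_URL):
        words.update(w.lower() for w in re.findall(r"[a-zA-Z]{6,}", title))
    new = sorted(words - seen)
    print("\n".join(new) if new else "nothing new")
    with open(SEEN_FILE, "w") as f:
        f.write("\n".join(sorted(seen | words)))

if __name__ == "__main__":
    main()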

Watching the mobile phone industry means taking the above and turning it up a few notches, because I want to actually have our people contact the manufacturers and find out what the "next hot phone" is and what it's called. Of course, industry experts are already utilising this technique.

This goes for paid search as well. There are currently only two AdWords advertisers bidding on pentapeptides, and one is eBay.

Bump Your Competitors' Multiple Listings Out of Google and Pick up a Position or Two

Ever wonder why, during a search, you find a competitor that has two pages listed above you? I call them kicker listings. The home page is always the second listing, and the first is an internal page that actually has relevant content.

Here is why this happens. When you submit a query, Google looks at each page's rank, and if two pages from the same site are close to each other in the results, it groups them together. If you are showing up in the first couple of pages of the SERPs, then it is most likely that you are listed again much deeper in the results. But when two pages are both near the top – top ten, or top 20 – Google shows them side by side, with the second (usually the index page) listed below the first and indented.

By going into "advanced search", the default number of results can be changed. Alternatively, you can add a bit of code to the URL string shown after a search for your keyword: insert "num=8&" immediately after "search?" (for example, http://www.google.com/search?num=8&q=your+keyword), and the results will be more refined. If that number doesn't change the results, reduce it. This will show you where your competitor's second page should actually be.

Okay, so now you should go back to the original search that showed the double listing. Within the search results, look where your competitor is showing up, then look below his listings for a non-competitor. It could be anything: a video, a news story, or a Wikipedia or eBay listing. Use the guide in Tip #11 to do some social bookmarking, or even link to the page from your web site (preferably on a second-level subdirectory).

What this will do is add a little boost to the non-competing web site and bump your competitor's "kicker" listing back to where it belongs: below your listing. This is surprisingly easy and quick using a combination of bookmarks and back links. It may even boost your trust rating with Google by giving you an outbound link to a high-ranking web site. Using this method on eBay sometimes provides a double boost, because if the listing is an auction rather than a store item, it may drop off the SERPs once the auction is over.

Automate XML Sitemaps – In the past you had to create several versions of your sitemap for the different search engine bots. They required these to properly crawl your web site's content for indexing (inclusion in their results).

Since then, two major changes have been made.

    1. A universal sitemap format was adopted: xml (this even includes Ask.com)

    2. A tweak was added that tells the bots to go to your robots.txt file first and look for a path to the XML file, so they know where to go. Additional features allow you to prevent the bots from crawling and indexing unnecessary files, such as cpanel, administration or even image files.

You can specify the location of the Sitemap using a robots.txt file by simply adding the following line:

Sitemap: <sitemap_location>

The <sitemap_location> should be the complete URL to the Sitemap, such as: http://www.example.com/sitemap.xml

This directive is independent of the user-agent line, so it doesn’t matter where you place it in your file. If you have a Sitemap index file, you can include the location of just that file. You don’t need to list each individual Sitemap listed in the index file.
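
Putting that together, a complete robots.txt might look something like this sketch; the disallowed paths are placeholders for whatever you don't want crawled:

User-agent: *
Disallow: /cpanel/
Disallow: /admin/
Disallow: /images/

Sitemap: http://www.example.com/sitemap.xml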

There are lots of places that offer a free XML sitemap generator:
SourceForge.net
xml-sitemaps.com
auditmypc.com

I think GSoft also has an open source program that will automatically create the XML sitemap and upload it via FTP if you set it up.

Because of the ever-changing content of a properly optimised site, as well as sites with CMSs (content management systems) and the millions of static sites out there, this is the method that I recommend.
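
If you would rather roll your own generator than use the options above, a minimal sketch follows. It writes sitemap.xml in the standard sitemaps.org format from a hard-coded URL list; the URLs are placeholders, and a real version would walk your site or CMS database instead.

# make_sitemap.py - minimal sitemaps.org-format XML sitemap writer.
# URLS is a placeholder list; generate it from your site in practice.
from datetime import date
from xml.sax.saxutils import escape

URLS = [
    "http://www.example.com/",
    "http://www.example.com/products.html",
]

def write_sitemap(urls, path="sitemap.xml"):
    today = date.today().isoformat()
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("  <url><loc>%s</loc><lastmod>%s</lastmod></url>"
                     % (escape(url), today))
    lines.append("</urlset>")
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines))

if __name__ == "__main__":
    write_sitemap(URLS)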

Additionally, fresh content will keep the robots coming back to index your site. Most of these programs will insert the creation date into the file to document that it is a revised version. The most important part, though, is that whenever you change anything within the site (and based on this article you may have a few changes to make), this assures you that the change will be picked up by the engines automatically, without your having to spend the time we did in the past to expedite the process.

Part of the problem is that if you add navigation to this new content and put it in the site template, as I mentioned earlier, search engines will parse (remember) the content and skip over it to preserve their allotment of data to crawl on each URL. They want to crawl deep and get as much content as possible, so they skip pages that provide no new content. (This is why multiple RSS feeds are important, as mentioned in Tip #13.)

Finding What Terms Are Converting Into Sales/Tracking Keywords to Conversion With Weighting

Having 100,000 unique visitors a day really doesn't matter in the end if you aren't getting any conversions (new members, info requests, sales).

Measuring successes and failures for landing pages, for on-page content like CTAs, and especially for keyword-to-sale paths yields some of the most important information you can gather and use to improve and optimise your overall web site.

Here are two scenarios to better illustrate this point:

  1. Paid Advertising – A car insurance company starts a paid advertising campaign on Google, and after a week or so they see that the name of their company, their "brand", seems to be converting the majority of their sales. Because of this discovery, they target the majority of their budget at brand terms like ABC Insurance and ABC Insurance Company.

A week later they see that their CPA (cost per acquisition) has nearly doubled, and they can't figure out why. When they look at Google Analytics and other third-party tracking software, both say the same thing.

So why is this?

Let's take a look at the buying process (also called funnel tracking) to see where they went wrong. Mrs. INeedInsurance hopped online while enjoying her morning java to look for insurance, because last night, when Mr. INeedInsurance opened his renewal notice, he got a significant premium hike. At dinner they decided to start shopping around for insurance. Mrs. INeedInsurance searched "car insurance" between 6 and 8 am that day, going in and out of different companies' web sites, learning what she was up against: tens of thousands of results. So at work (11am-2pm is the #1 time people shop online – not necessarily making purchases), Mrs. INeedInsurance, having learned a bit about search, decides to add her city to the query. This time she searches "car insurance London" and still gets several thousand results, but at least they are localised, and there are a few that she recognises from the morning, so she goes in and fills out a few of the forms to get quotes. Throughout the rest of the day she gets the quotes, either immediately from the web site or via email. Now she's getting somewhere. Jump forward to after dinner that evening. Mr. INeedInsurance looks through the notes his wife brought home and decides that ABC Insurance offers the best deal for the money, then goes to Google, searches for ABC Insurance, and makes the purchase.

See what happened here? I use this as an example because this is exactly what I identified for a client a few years back that inevitably led to changes that doubled their conversions.

The problem is that all the data pointed to ABC Insurance’s brand name as being the top converting term, so that’s where they concentrated the bulk of their budget. In actuality, ‘car insurance’ and then ‘car insurance London’ were the terms that actually led up to the sale.

The reason this is important for PPC campaigns, or any paid advertising, is that many platforms allow you to do keyword weighting. This is where you increase or decrease your bids by a percentage according to day parting. Day parting is turning your ads up or down according to the timetable that you put in place.

In this instance I would turn my bids up to 125% on ‘car insurance’ and ‘car insurance London’ in the morning and afternoon, then down at night. On ‘ABC Insurance’ I would turn the bids down in the morning to 50%, and then back up to 125% in the evening.

Keyword weighting also allows you to weight your keywords and track them to conversion. It places a cookie on the end user's computer to track what keyword brought them to the site, what keyword resulted in a quote, and what keyword resulted in a sale.
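
Here is a minimal sketch of that cookie logic, written as a self-contained WSGI app; the kw parameter and the first_kw cookie name are placeholders for whatever a real tracking system uses. The first keyword that brings a visitor in is stored, and a later conversion is credited to it rather than to the last search.

# first_touch.py - sketch of first-touch keyword attribution: remember
# the first keyword that brought the visitor in (cookie "first_kw")
# and credit conversions to it, not to the last search.
from http.cookies import SimpleCookie
from urllib.parse import parse_qs, quote, unquote
from wsgiref.simple_server import make_server

def app(environ, start_response):
    qs = parse_qs(environ.get("QUERY_STRING", ""))
    jar = SimpleCookie(environ.get("HTTP_COOKIE", ""))
    first_kw = unquote(jar["first_kw"].value) if "first_kw" in jar else None
    headers = [("Content-Type", "text/plain")]

    if first_kw is None and "kw" in qs:      # first visit with a keyword
        first_kw = qs["kw"][0]
        headers.append(("Set-Cookie",
                        "first_kw=%s; Max-Age=2592000" % quote(first_kw)))

    if environ.get("PATH_INFO") == "/buy":   # the conversion page
        body = "conversion credited to: %s" % (first_kw or "unknown")
    else:
        body = "first-touch keyword: %s" % (first_kw or "none yet")
    start_response("200 OK", headers)
    return [body.encode("utf-8")]

if __name__ == "__main__":
    # try http://localhost:8000/?kw=car+insurance and then /buy
    make_server("", 8000, app).serve_forever()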

This is beneficial because I can further adjust my bidding strategies according to demographics and geographical metrics.

With these cookies I can also successfully measure and establish LTV (Lifetime Values) of the average customer. This allows me to adjust the conversion value, which allows me to go back to my company/client and potentially get a higher advertising budget.

Using this same insurance company as an example: initially, they gave me a conversion value of $25. Now, since we were able to identify other sales made by this customer, the conversion value is $40.

Offline, this company spends 100,000 on advertising through different venues, acquiring customers at an average cost of £/$56. Guess what happened the next month? They increased the budget by 100,000.

  2. Organic Advertising – Same scenario as above, except ABC Insurance Company identifies, through log files or Google Analytics, that its top converting keyword – the one getting sales – is "car insurance".

In light of this, the decision maker creates a landing page that is fully optimised, so that the relevancy grade all three search engines use will improve their organic positions – which it will.

The problem here is that the term actually bringing visitors to the web site to buy was "cheap car insurance". If they had identified this, they could have built the page around "cheap car insurance" rather than just "car insurance". The page would then have served double duty, acting as a great landing page for both keyword phrases.

This is why tracking your keywords to conversion is so important. It can save thousands on paid advertising and identify the actual keyword phrases that need pages built around for improving organic rankings.

If you are experiencing a high bounce rate or what you feel is high cart abandonment, you might be surprised to find that many didn’t buy elsewhere; they actually came back to you and bought.

This is also helpful in refining your stats. Rather than show this customer as 3 separate visitors, it identifies (through the cookies) that they were actually just one visitor, and the bounce rate or cart abandonment is significantly reduced.

This information can be invaluable as well.

For instance, maybe I was getting high cart abandonment from unique users, significantly higher once they went to checkout. I know that happens when I add shipping costs into the total. So I might do some A/B testing – with shipping costs listed separately, added into the price initially, or added during checkout – and see which converts better. Or I may set the web site up to recognise the cookie and create a drop-down that offers free shipping today with any purchase over $/£XX.XX.

There are endless possibilities for using this information. Some tools that support this kind of tracking:

WebTrends
DirectTracks
BidBuddy

Supplemental Results – What They Are, How to Find Them and How to Get Out of Them

Supplemental sites are part of Google's auxiliary index. Google is able to place fewer restraints on sites that it crawls for this supplemental index than on sites crawled for the main index. For example, the number of parameters in a URL might exclude a site from being crawled for inclusion in the main index; however, it could still be crawled and added to Google's supplemental index.

According to Google, the index in which a site is included is completely automated; there's no way for you to select or change the index in which your site appears, and we are assured that the index in which a site is included does not affect its PageRank.

Nonsense!

At the time of this article, Google was already starting to eliminate the supplemental label from its search results. Until recently, all you had to do was go to the last few pages of your query and locate the pages that had "– Supplemental Result" just after the page size. They aren't showing these anymore. Here's what they had to say:

"Since 2006, we've completely overhauled the system that crawls and indexes supplemental results. The current system provides deeper and more continuous indexing. Additionally, we are indexing URLs with more parameters and are continuing to place fewer restrictions on the sites we crawl. As a result, Supplemental Results are fresher and more comprehensive than ever. We're also working towards showing more Supplemental Results by ensuring that every query is able to search the supplemental index, and expect to roll this out over the course of the summer.

"The distinction between the main and the supplemental index is therefore continuing to narrow. Given all the progress that we've been able to make so far, and thinking ahead to future improvements, we've decided to stop labeling these URLs as 'Supplemental Results.' Of course, you will continue to benefit from Google's supplemental index being deeper and fresher."

Google then said that the easiest way to identify these pages is as follows: "First, get a list of all of your pages. Next, go to the webmaster console [Google Webmaster Central] and export a list of all of your links. Make sure that you get both external and internal links, and concatenate the files.

"Now, compare your list of all your pages with your list of internal and external backlinks. If you know a page exists, but you don't see that page in the list of sites with backlinks, that deserves investigation. Pages with very few backlinks (either from other sites or internally) are also worth checking out."

Nonsense!

The easiest way to identify your supplemental pages is by entering this query: site:www.yoursite.com/&

Okay, so now you have identified the pages that are in the supplemental results and not showing up anywhere in the normal results.
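
(If you do want to run Google's file comparison as a cross-check, it is easy to script. This sketch assumes two plain-text files with one URL per line: pages.txt, every page you know exists, and linked.txt, the concatenated link export; both filenames are placeholders.)

# orphan_pages.py - compare a list of all known pages against a list of
# pages that actually have backlinks, per Google's suggested method.
def load(path):
    with open(path) as f:
        return {line.strip().rstrip("/") for line in f if line.strip()}

pages = load("pages.txt")
linked = load("linked.txt")

for url in sorted(pages - linked):
    print("no backlinks found:", url)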

Now we need to identify why they are there. The main reasons that a page goes to supplemental results are:

  1. Duplicate Content

  2. 301s – redirected pages that have a cache date prior to the 301 being put in place

  3. A 404 was returned when Google attempted to crawl it

  4. New Page

  5. Bad Coding

  6. Page Hasn't Been Updated in a While

  7. Pages That Have Lost Their Back Links

  8. And, according to Matt Cutts of Google, "PageRank is the primary focus determining whether a URL is in the main web index vs. supplemental results"

Now this isn't the end-all, but it covers about 95% of the reasons that you may be in the supplementals.

So now we know what they are, how to find them and why they are most likely in the supplemental results. Now let’s get them out of there.

Here are the different methods that I use when I find that a page has gone supplemental:

  1. Add fresh content to the page

  2. Add navigation to the page from the main page

  3. Move the pages to the first subdirectory if it is not already there

  4. Get a back link to the page and/or create a link from an existing internal page with the anchor text containing the keywords for that page

  5. Do some social bookmarking on the page

  6. Make sure the page is included in the XML sitemap, and then resubmit the sitemap to Webmaster Central.

  7. Lastly, if none of the above seems to be working after 90 days, and I have another page that is relevant, has PageRank, and isn't listed in the supplementals, I do a 301 (permanent redirect) to it from the supplemental page (see the example below).
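
On an Apache server, that 301 is a one-line directive in the site's .htaccess file, assuming mod_alias is available; both paths here are placeholders:

Redirect 301 /old-supplemental-page.html http://www.example.com/better-page.html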

My Top SEM Tools

This is a comprehensive list of tools that my team and I use daily at Stickyeyes.

The first one, Firefox with the three extensions, comprises the tools that I spoke about at SES London 2007. I encourage you to install it and check it out. It's one of the better tools I have used.

I use all three of these applications: IBP, SEOElite and WebCEO. Each has its own best tool, and each is indispensable.

Firefox SEO Add-Ons (1), (2) & (3) (Density, links, code cleaner, W3C Compliance, etc.)
Google Analytics - Provides deep analysis on all traffic, including paid search.
IBP - Several tools for checking rank positions, basic SEO page analysis and link building tools
SEO Elite – Excellent for Link Building, analysis, finding where competitors are advertising
WebCEO - Site optimization, promotion and analysis.
Back Linking/Bookmark Tools
Bookmark Demon and BlogCommentDemon – Automates the process of bookmarking and posting to blogs
Link Building 101 - Basic Link Building Instructions and Tips.
Link Baiting - Good Link Baiting Tutorial
Google Webmaster Central – Statistics, diagnostics and management of Google’s crawling and indexing of your website, including Sitemap submission and reporting.
Comprehensive Link Building 101
Link Baiting *Instruction
Pay Per Click Tools
Keyword Elite – I use this within my arsenal of keyword tools
Wordtracker - Data is based on the Metacrawler and Overture search engines.
KeywordDiscovery - Data is based on a number of search engines.
Keyword Optimizer - Enter a list of keywords and this tool will remove any duplicate entries and re-order the list alphabetically.
Google Analytics - Provides deep analysis on all traffic, including paid search.
Google Suggest - As you type, Google provides the top 10 most popular keywords that begin with the keyed-in letters, in order of popularity.
SpyFu - Find out what competitors are bidding on, with estimates for the cost of PPC advertising and other bells and whistles.
Hittail – Finds and easily groups the actual terms being used to find your site into an Excel format. Great for finding niches and long keyword strings.
Google Trends - Graphs historical trends of various Google searches.
Google Keyword Tool External - Historical trends in keyword popularity.
BidCenter - A good tool for comparative analysis and easy to use
SEO Sleuth - Find what AOL users search for (AOL produces 2x the retail conversions of any other engine)
ROI Calculator - This calculator measures the ROI (return on investment) of a CPC (cost per click) campaign.
Adwords Wrapper – Concatenates multiple words into a usable format in Adwords
PPC Hijacking *Information
PPC 101 *Instruction
PPC 102 *Instruction

Site Tools

Virtual Webmaster – This is a great tool for the "Do-It-Yourself" type. 200 web developers will complete any web site change request in 48 hours.
C-Class Checker - Use the Class C Checker if you own several cross-linked sites. If you do, it may be more efficient (for SEO purposes) to host them on different Class C IP ranges.
Code to Text Ratio - This tool will help you discover the percentage of text in a web page (as compared to the combined text and code).
Future PageRank - This tool will query Google’s various data centers to check for any changes in PageRank values for a given URL.
Internet Officer - Checks for Redirects
Live PR - The Live PageRank calculator gives you the current PageRank value in the Google index, not just the snapshot that is displayed in the toolbar.
Keyword Cloud - This tool provides a visual representation of keywords used on a website.
Keyword Difficulty Check - Use the Keyword Difficulty Check Tool to see how difficult it would be to rank for specific keywords or keyword phrases.
Page Size - This tool will help you to determine HTML web page size.
Site Link Analyzer - This tool will analyze a given web page and return a table of data containing columns of outbound links and their associated anchor text.
Link Analysis - Find out about all links on a page, including hidden ones and nofollow-markers
Spider Simulator - This tool simulates a search engine spider by displaying the contents of a web page in exactly the way the spider would see it.
URL Rewriting - This tool converts dynamic URLs to static URLs. You will need to create an .htaccess file to use this tool.
Keyword Misspelling Generator - allows you to generate various misspellings of a keyword or phrase to match common typing errors. Useful for creating keyword lists around your most important keywords to bid on.
Keyword Density Analysis Tool - finds common words and phrases on your site.
Hub Finder - Finds topically related pages by looking at link co-citation.
Page Text Rank Checker - tool allows you to check where your site ranks for each phrase or term occurring on the page.
XML Sitemaps - makes XML sitemaps for sites.
PageRank Toolbar For Mac - A widget to show PageRank for the site you are on.
Xenu Link Sleuth – Use to find broken links. Supports SSL sites and also reports on redirects.
Mobile Readiness Report – See how well your site is formatted for mobile phones. Includes Visualisation.
Javascript Content Hiding – Hide content on your site from search engines and other crawler/bots

Google Tools
Google Webmaster Central *Tool
Google Labs *Tool
Check For Google Supplemental Results
SpyFu for Google Bidding
Google Future PR
Google Sandbox
Google Dance Watch
Google Page Rank Formula and Sandbox Explanation
Google Information and FAQ
Google Reinclusion Request
Banned by Google?
Google Advanced Search
Google Data Center Pages Indexed Check
Google Page Rank Check (All DC’s)
Google Keyword Ranking Check
Google "Need-To-Know" Info
Beginner Adwords Tips
How Google Analytics Work
Check Google Keyword Prices
Hit Tail *Advanced Adwords
Hit Tail Documented
Fake Page Rank Detection Tool
Adwords Click Fraud Study *Information

And Even More SEO Tools

Getting into DMOZ    

Meta Tag Generator  
RoboForm – A MUST-HAVE!  
Keyword Density

Redirect Checker

Robots.txt Generator  

Link Popularity  

Domain Age Check  

Code-to-Text  

Spider Simulator  

Who Supplies Who with Search Results

Abuse IP Checker Tool

IP Information Tool

IP, City and reverse IP Lookup *Tool

Ping Tool

Traceroute Tool


Other Useful Tools

Data Recovery Software – Powerful Data Recovery with ‘on-the-fly’ viewing

Create Multimedia PDF eBooks – Great for eTailers. This creates search engine optimised, customer-facing PDF documents, as mentioned in Tip #5

Streaming Video For Your Website - Add Streaming Audio Or Video To Your Website Easily And Quickly

Add YouTube or Arcade Scripts - Entertainment Scripts Which Allow You To Start Your Own YouTube, MySpace, Break Or Arcade Website.

WordPress Auto Content Generator – Auto Generates Fresh Content for Your Blog

Next Generation RSS (SEO) Software – Add RSS Feeds To Your Site Easily

Bookmark Demon and BlogCommentDemon – Automates the process of Bookmarking and Posting to Blogs
Aaron Wall's SEO Book – Yeah, I know, why am I listing this? I guess because it's a great resource that provides a few things I haven't listed here.
Traffic Travis – Another Unique SEO Tool with its Own Unique Merits

Half Again.com – Content, Blog and RSS Generators


SEO Checklist – I am currently having this checklist developed into an automated tool. Email me (re: Automated SEO Checklist When Available) if you would like a beta version when I get it.

This checklist will take care of approximately 75% of your SEO.

SEO Checklist


KW:


Page: http://




Metatags and on-page optimisation

http://www.seochat.com/seo-tools/meta-analyzer/ – Are the keywords in the title, with a one-word buffer? (Max: one keyword phrase.)

http://www.seochat.com/seo-tools/meta-analyzer/ – Are keywords in the META keywords? It's not necessary for Google, but a good habit. Keep the META keywords short (128 characters max, or 10).

http://www.seochat.com/seo-tools/meta-analyzer/ – Are keywords in the META description? Keep the keyword close to the left, but in a full sentence.

Check Content – Are keywords in the top portion of the page, in the first sentence of the first full-bodied paragraph (plain text: no bold, no italic, no style)?

Check Content – Are keywords in an H2-H4 heading?

Check Content – Are keywords in bold? (Second paragraph if possible, and anywhere but the first usage on the page.)

Check Content – Are keywords in italic? (No more than once.)

Check Content – Are keywords in subscript/superscript? (No more than once.)

Check Content – Are keywords in the URL (directory name, filename, or domain name)? Do not duplicate the keyword in the URL.

Check Code – Are keywords in an image filename used on the page?

Check Content – Are keywords in the ALT tag of that image?

Check Content – Are keywords in the title attribute of that image?

Check Content – Are keywords in an internal link's text?

Check Code – Are keywords in the title attribute of all links targeted in and out of the page?

Check Code – Are keywords in the filename of your external CSS (Cascading Style Sheet) or JavaScript file?

Check Content – Are keywords in an inbound link on the site (preferably from your home page)?

Check Content – Are keywords in an inbound link from a related offsite page (if possible)?

Check Content – Are keywords in a link to a site that has a PageRank of 8 or better (e.g. .gov or .edu)?

Check Code – Are keywords in an HTML comment tag? <!-- keyword -->

Technical

IBP – What is the code-to-text ratio? Text should at minimum be a higher percentage than the code. (See the sketch after this checklist.)

IBP – How many links are pointing to the full URL (with http://)?

IBP – How many links are pointing to the domain?

IBP – Have you associated the http and the http://www versions of your site with Google?

IBP – What is the domain name visibility? A count of results at Google for a search for the domain, showing URL visibility rather than incoming link count.

IBP – Number of internal pages that link to the home page?

IBP – Number of Technorati links?

IBP – Number of del.icio.us links?

IBP – What is the page size? Should be under about 40k.

IBP – How long does it take to load the page? Should be under 1.3 seconds on a 56k connection.

IBP – Is the top keyword density on each page between 3-7%? (See the sketch after this checklist.)

http://www.internetofficer.com/redirect-check.html – Do you have any redirects? Using 302 redirects is one way Google identifies potential SPAM sites; Google has specifically said to use a 301, NOT a 302.

http://validator.w3.org/detailed.html – Is the page W3C compliant?

http://www.copyscape.com/ – Is there any duplicate content out on the web? You shouldn't be above 40% for any of your pages.

http://www.123promotion.co.uk/directory/ – Is the site in the top 10 directories?

http://www.seochat.com/seo-tools/spider-simulator/ – Is a spider seeing all of the site content?

Does each page have titles that are not dynamically generated? Maximum one instance of the '=' symbol.

IBP – Is there JavaScript within the content? Move it off.

Other Issues

Check Content – Are there at least 250 words in the content?

Check Code – Is your JavaScript in external files, named with your keywords?

Check Code – Is there alternative navigation for Flash or frames?

Check Content – Are there XML and HTML sitemaps?

Xenu (download) – Are there any broken links?

Check Code – Is there a robots.txt file?

Check Code – Do you have a path to the XML sitemap in the robots.txt file?

http://www.netmechanic.com/toolbox/power_user.htm – Browser compatibility (IE, Netscape, Opera, Firefox, Mosaic and Safari)?

Linking

SEO Elite – # of Google backlinks?

SEO Elite – # of MSN backlinks?

SEO Elite – # of Yahoo backlinks?

SEO Elite – DMOZ listing?

Check Site – Does the site offer outward RSS feeds?

Check Site – Does the page have RSS feeds for fresh on-page content, on pages other than the index page?

Check Site – Does the site have an SEO-optimised 404 page?

Search Google for site: and .pdf – PDF-optimised docs in the root, with a navigation page listing each doc's description and link. Also a separate XML sitemap for these, and a separate submission.

http://home.snafu.de/tilman/xenulink.html – 302 redirects? Change these to 301s – Google will penalise you for them if you leave them up too long.
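
Two of the checklist items above – the code-to-text ratio and keyword density – are easy to approximate yourself. The sketch below strips tags with a regex and splits on whitespace, so treat its output as ballpark figures only; the URL and keyword are placeholders.

# page_stats.py - rough approximations of two checklist items: what
# fraction of the page is text vs. markup, and how dense one keyword is.
import re
import urllib.request

def page_stats(url, keyword):
    html = urllib.request.urlopen(url, timeout=15).read().decode(
        "utf-8", errors="replace")
    # Drop scripts, styles and tags; what remains is (roughly) the text.
    text = re.sub(r"(?s)<script.*?</script>|<style.*?</style>|<[^>]+>",
                  " ", html)
    words = text.lower().split()
    hits = sum(1 for w in words if keyword.lower() in w)
    return {
        "text_percent": round(100.0 * len(" ".join(words)) / max(len(html), 1), 1),
        "keyword_density_percent": round(100.0 * hits / max(len(words), 1), 1),
    }

if __name__ == "__main__":
    print(page_stats("http://www.example.com/", "insurance"))  # placeholders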




  

About the Author

Gary R. Beal is originally from the United States. Now living in the UK, he travels to conferences all over the world.

Gary has "crossed the pond" to close the gap between the US and Europe in online marketing, training many U.S.-based search managers at top agencies, companies and conferences. In 2007 Gary spoke at the SES conference in London, and at gaming and affiliate conferences around Europe.

Gary is the Director of Search at Stickyeyes – one of the UK's leading internet marketing agencies, with a client portfolio that includes major corporations such as MTV, Jaguar, O2, Jet2, Littlewoods Bingo, Mecca Bingo, First Direct, Lloyds TSB and many others.

Gary attended Ohio State University in the U.S. and holds a Master's degree in Biometrics and Mathematical Statistics. He has been instrumental in the development of many search engine optimisation and pay per click tools as an analyst and consultant.

He is well known in most of the top SEO/SEM/PPC forums, a staff writer for DevShed and SEOChat, and a Moderator at SEO Chat. He has worked for many years in lead aggregation for highly competitive industries such as Online Gaming, Banking and Finance, Insurance, Travel and Investments and can effectively speak about doing business in these industries, as well as successfully doing business on the internet.
