Website Re-launch and Maintaining Search Rankings

After re-launching a website for a client, the number of visitors from search engines often dries up faster than Michael Jackson’s fan base. I have seen this a number of times as an SEO consultant; in fact, it happened to me. This article is written for those interested in re-launching a website while minimizing the impact on their search engine rankings. It assumes you have a basic understanding of SEO practices. So, put on your propeller hats and let’s get down to it.

 

There are a number of SEO tips you can employ when re-launching a website; however, which tips apply will depend on the website structure you decide to use:

 

1. The new website will be based on the old website’s structure (e.g.: same directories, same page names, the website’s leftovers).

 

2. The new website will have a new structure (e.g.: new directories, new page names, the whole shebang). 

 

These are the two situations I’ll be speaking about in this article. In addition, when it comes time to re-launch, I also have a few tips to offer.

 

Let’s start our propellers!

 

 

If you’re working with the leftovers because the old website was successful, congratulations and take warning. Chances are the last webhead did a great job in her optimization efforts. You will be able to reuse a lot of the old content. After the re-launch you’ll be credited for a successful transition and given a hearty pat-on-the-back. You don’t want to mess up by redesigning the website and flushing all your traffic down the toilet. Your SEO goal in the redesign: maintain or better the search rankings.

 

I have worked on a number of websites which had great search engine traffic, but looked like a design from the IT Department – need I say more? Everybody in the company knew it was time for a visual overhaul (except the IT people, of course), but those in charge were afraid that a redesign would torpedo their search engine traffic.

 

Let’s remember that search engine results are largely based on header information, content, website structure, and inbound links. Search engine results have nothing to do with great or not-so-great web design. Simply put, if we maintain these key factors, we’ll maintain our search results. The easiest way to maintain your search rankings is to copy and paste the title tag information, meta-information, and content into the new web pages. It’s never quite as easy as copy and paste, but if you can, do it. Text and headers, like <h1> tags, can always be reformatted to match the new design, and images can be added without impacting the keyword density (KD) of the winning web page.
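For example, a winning page’s header information can be carried over verbatim into the new design’s <head> section. The title and meta values below are placeholders for illustration, not recommendations:

```html
<head>
  <!-- Copied unchanged from the old, well-ranking page -->
  <title>Ski Coats - Winter Outerwear</title>
  <meta name="description" content="Quality ski coats for the whole family.">
  <meta name="keywords" content="ski coats, winter jackets, winter outerwear">
</head>
```

Everything inside the <body> can then be restyled freely, so long as the text content itself survives the move.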

 

Can you change anything on the web page? Well, I usually do. I can always find ways to better a web page, but that’s me. If you’re not confident that you can further optimize the page, then don’t. In fact, it’s a good strategy to put all your old information into a new web design, then upon its re-indexing, check to see how the website’s rankings were affected. If you’ve maintained your rankings, then you can breathe a sigh of relief. Those changes you thought you could add to better your rankings can now be added for the next indexing.

 

Now, the advice above assumes that every page will be copied over to the new design. This scenario is pretty unlikely. Chances are you have a solid homepage and products and/or services pages to which you can apply the above advice. But what about those sub-pages that never got indexed?

 

 - Are the pages behind graphic links, Flash animations, DHTML menu systems, or long query-strings?

 

 - Are there web pages more than three clicks from the homepage with no text-link route to them?

 

If you answered yes to either of these questions, then you have some rethinking to do. On the bright side, these unindexed web pages can be renamed after targeted keywords (i.e.: ski-coats.htm, cell-phones.htm) instead of, say, sc.htm or cp.htm – just ensure nobody has linked to these pages before you change the names.

 

If you’re using graphic links, Flash navigation, DHTML menu systems, query-strings, or the web pages are deep in the site, you’ll need to rethink your navigation. As you probably know, search engine spiders cannot follow the links in these types of situations. Rework the navigation to be text-based or create a sitemap.

 

 

A sitemap is an excellent way to expose all of a website’s deep pages to search engine spiders. Place the sitemap’s text link high within the new website’s design so that even spiders which truncate their crawl of a page will have the opportunity to follow the link.

 

The sitemap page should be constructed of purely text links leading to all those nooks and crannies of your website. In the future, I will write an article on creating effective search engine friendly sitemaps. For now, use the KISSES method: Keep It Simple So Engines Spider. I just made that up, but it works.
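As a sketch, the sitemap page can be nothing more than a heading and a list of plain text links – the page names below are invented for illustration:

```html
<h1>Sitemap</h1>
<ul>
  <li><a href="/ski-coats.htm">Ski Coats</a></li>
  <li><a href="/cell-phones.htm">Cell Phones</a></li>
  <li><a href="/about/contact.htm">Contact Us</a></li>
</ul>
```

No graphics, no scripting – just links a spider can follow.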

 

By keeping the sitemap link on each web page, high in the design, text-based, and the sitemap easy to index, you solve all those reasons for not having pages indexed.

 

 

If you’re fortunate enough to be doing the whole shebang, congratulations and take warning: chances are you’re going to look like a god after the re-launch. Your boss will be buying you lunch and children will be named after you. Your work on the redesign will be much more intensive, but having children named after you is a greater reward than a pat-on-the-back. Your SEO goals in the redesign: design a website structure that will enhance your website’s search engine rankings, and smooth the transition for the search engine spiders.

 

Sometimes it’s necessary to start anew. As a matter of fact, the majority of upgraded websites have to be reworked to some degree. This has been my experience; most organizations want enhanced features on their websites, new product lines, and more company information. This usually necessitates increasing the number of directories and pages on the website, but don’t trash the old website just yet.

 

The old website usually has some structural features that can be retained. For example, directories which can work within the new website structure should remain, as should the default directory documents. That is, if the old website uses index.htm as its default document, then you should attempt to retain that standard. This last point is not crucial, especially if the old website was a dog with little or no traffic. Moreover, if you’re moving from a static website to a dynamic website, then the default document names are going to have to change; don’t fuss about this too much. Bottom line: if the search engines have your default documents within their caches, and those pages come up in search results, you don’t want to create 404 errors by changing their names without a plan. We’ll talk about a plan in a few minutes.

 

 

There are no hard-and-fast rules about setting up the underlying directories when it comes to website building; I have seen a handful of different directory structures obtain high search rankings. So, the prime consideration when site mapping should be that the structure is logical and well organized for the organization you’re building it for. I’m sure some Information Architects are pulling out what little hair they have reading my one-sentence summation of site mapping, so I apologize to them. To paraphrase: there is no reason why SEO practices and proper website structure cannot work alongside each other.

 

When it comes to naming the new directories and web pages I have a few tips that can have a positive effect on your search engine rankings:

- Directories can be keywords (i.e.: /computer-training-services/, /dog-food-products/, etc.)

- Pages can be keywords (i.e.: windows-training.htm, dog-supplies.asp, etc.)

As you can see, I use hyphens, not underscores, to separate keywords. This follows the advice GoogleGuy gave back in September 2003, when he stated that “Google does not treat underscores as word separators.”

 

 

We certainly don’t want to create 404s by renaming all the old pages to new ones. There are cases when you’re going to want to retain remnants of the old site, but you have to change the web page’s name. An example of this is when the website moves from a static website to one driven by an application language like ASP, PHP, or CFM. What do you do then?

 

There are two strategies you can use:

 

Strategy 1 – Retain the web page under its old name, but give it the new design. Do not create duplicate content on your website by copying and pasting this content into the new website; duplicate content is a mistake that search engines frown upon, and it can cost your website its rankings. Instead, create new content for the new website which accomplishes the same goals as the old content. In effect, you have two pages with similar content.

 

Strategy 2 – Implement a 301 Permanent Redirect on each old page, pointing to its new page. This is the preferred method. The 301 Permanent Redirect informs search engine spiders that the web page previously at this location has been permanently moved. In the case of Google, all PageRank will be transferred to the new page. In time, the old web page will fade from the search engine indexes and be replaced by the new web page.

 

I have a few 301 scripts that can be pasted into the top of old web pages. Of course, these will only work if the web page is using the appropriate application language:

 

For Coldfusion:

 

<cfheader statuscode="301" statustext="Moved Permanently">

<cfheader name="Location" value="http://www.yourwebsite.com/newdirectory/newpage.htm">

 

For ASP:

 

<%
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.yourwebsite.com/newdirectory/newpage.htm"
Response.End
%>

 

For PHP:

 

<?php
header("HTTP/1.1 301 Moved Permanently");
header("Location: http://www.yourwebsite.com/newdirectory/newpage.htm");
exit;
?>

 

On an Apache server, you can add 301s to the .htaccess file:

 

Redirect 301 / http://www.newsite.com/
Redirect 301 /olddirectory/ http://www.newsite.com/newdirectory/
Redirect 301 /olddirectory/oldpage.html http://www.newsite.com/newdirectory/newpage.html

Once you have implemented the redirects, you should check the headers to ensure they are sending 301 Permanent Redirects: http://www.webmaster.bham.ac.uk/headers/.
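If you’d rather check the headers yourself, here is a minimal Python sketch (my own illustration, not part of the original toolset) that requests a URL without following redirects and reports the status code the server sends. The built-in test server below simulates an old page that has been 301-redirected; in practice you would point check_status() at your own old URLs.

```python
import http.server
import threading
import urllib.error
import urllib.request

class OldPageHandler(http.server.BaseHTTPRequestHandler):
    """Simulates an old page that now answers with a 301."""
    def do_HEAD(self):
        self.send_response(301)
        self.send_header("Location",
                         "http://www.yourwebsite.com/newdirectory/newpage.htm")
        self.end_headers()
    def log_message(self, *args):  # silence request logging
        pass

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # returning None makes urllib raise instead of following

def check_status(url):
    """Return the HTTP status code the server sends for `url`,
    without following any redirect."""
    opener = urllib.request.build_opener(NoRedirect)
    request = urllib.request.Request(url, method="HEAD")
    try:
        return opener.open(request).status
    except urllib.error.HTTPError as err:
        return err.code  # unfollowed redirects surface as HTTPError

# Demo against the local simulated server
server = http.server.HTTPServer(("127.0.0.1", 0), OldPageHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
print(check_status(f"http://127.0.0.1:{port}/oldpage.htm"))  # prints 301
server.shutdown()
```

Anything other than 301 (a 302, or a 200 from a meta-refresh page) means the spiders are not getting the “permanently moved” signal.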

Unfortunately, if you have a completely static web page on an IIS server, you have only one option that I am aware of: your web host can set up the IIS server to do a server-side 301 redirect. The hitch is that these 301s work only at the domain level (e.g.: http://www.olddomain.com –> http://www.newdomain.com), which is not much help when you just want to change a web page’s name. Your options for static pages are:

1. Use Strategy 1; or,

2. Place a notice on the old web page that the page has moved to another location, with a text link so that visitors and spiders can find their way to the new location.

Whether you’re using Strategy 1 or 2, once the new web pages are indexed and are doing well, it’s time to work on flushing the old web pages from the search engines’ indexes. To do this you can paste the following into the head of the old web pages:

<meta name="robots" content="noindex, nofollow">

This informs the robot not to index this web page any longer. In time, the page should fade from the search results. Keep an eye on your logs to determine when these pages are no longer accessed and it’s safe to archive them permanently.

 

 

If all the inbound links are attached to the domain name (e.g.: http://www.domainname.com/) then you don’t have to worry; the change to a new website will have no impact on your inbound links. It’s always good practice to point links at the domain or directory and not at the default documents.

If, however, you have inbound links attached to web pages that have to have their names changed (e.g.: index.htm –> index.asp), then, after the re-launch, you’ll need to contact those linking to you and get them to update their links.

Managing inbound links is a pain in the… neck.

 

 

You are going to find 404 errors in your logs after the re-launch. It’s the Internet after all, and you just never know how the heck people find your website. The best way to manage those visitors who happen to request a deleted page is through a custom 404 error page. Implementing one is as simple as creating a web page and having your web host configure your website to use this page when a ‘Page Not Found’ error occurs.
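On an Apache server, for example, pointing the website at a custom error page takes one line in the .htaccess file – the file name 404.htm here is just an example:

```apache
ErrorDocument 404 /404.htm
```

On IIS, your web host can set the equivalent custom error page through the server configuration.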

 

The format of the 404 web page should be that of a sitemap. I don’t like to alarm the visitor that they just hit an error so my message is similar to that on the sitemap, but with a short line about the page not being found. With a well laid out sitemap, the visitor should be able to locate the section of the website they were looking for without too much trouble.

 

 

So, the redesign is complete. You’re ready for the re-launch and the accolades that will follow. So when is the best time? Some might answer that it’s fine to re-launch whenever the redesign is complete, but then you’d risk a spider catching the website mid-transition and damaging your rankings.

From my experience and research I’ve decided the best time is ten days after Googlebot’s next re-indexing. If, for example, Googlebot happens across your website on the 15th of the month, then wait until the 25th. The wait takes into consideration that Googlebot sometimes returns after the initial re-indexing to continue its work. After the ten day wait, you will then have approximately 10 to 14 days to complete your re-launch in preparation for Googlebot’s next return.

Well, that’s it. You can turn off your propellers now. Good luck on your redesign, and let me know if any children are named after you.

 
