Web sites are becoming more interactive. The second Internet boom is on, fueled by new technologies, new concepts and radically new business models. "Web 2.0" is the name Tim O'Reilly coined to describe these new web sites, which have redefined the way users locate and interact with data and have brought with them a new kind of publisher.
How will advertisers and designers handle the new power users have over data? How will publishers protect their core competencies, now that data can be found far from the actual URL the data originated from? This article will look at the skills designers need to design for interactivity "web 2.0 style," and in turn how publishers and advertisers can utilize these tools that their designers use to get higher rankings on SERPs. Apart from higher search engine rankings, we will also look at how they can get their information to where the eyeballs are, as the search engines are no longer the sole custodians of databases where users get URL listings.
A few articles have already been written on RSS, XML and even AJAX on both SEO Chat and partner sites such as Dev Articles. A short list of links to relevant tutorials and articles (one being Danny Wall's "How to Get Extremely High Rankings on Yahoo," which talks about using RSS to become a content aggregator) will be included at the end of this article. I don't promise to give all the answers, but reading this article will at least provide you with a guide when you are considering how to update your website for future users. It should also be a good guide for websites that are starting up and intend to be publishers (who seem to be the only ones who consistently enjoy the search engines' favor).
In the late nineties, PHP (PHP: Hypertext Preprocessor) caught on with web designers, and websites were built using PHP to parse files and to mediate between web browsers and databases (database-driven sites). Suddenly it was easier for designers to build custom scripts without resorting to "serious" programming.
This was also when web 2.0 began, and really it was the web 2.0 companies (such as Amazon, Google and Napster) that survived when the Internet bubble burst. Web 2.0 introduced the concept of user-generated content: Amazon let users write reviews of their favorite books, Google built PageRank, which measures whether other sites consider your own site important enough to link to, and Napster enabled peer-to-peer downloads. The more users each site actually had, the more important the site became (which is the whole point of web 2.0). These websites were predominantly database-driven "dynamic" sites, generating constantly changing data and pages.
No longer was a website built by one person, maintained by one person and viewed only by typing a URL into the address bar. Search engines crawled pages, indexed them and put the information on their databases. Now searchers could type in key words and get content listing URLs and some meta data. Directories listed URLs after sorting them into categories and placed profiles next to them. Designers began to design "search engine friendly sites" and now, designers have started building "blog friendly sites."
The most important thing designers should note when designing for a situation where the content on the site may be viewed away from the site it is generated from is that the "aesthetics" of the site are actually not as important as they used to be. It is no longer about just how "cool looking" your site is, though that is good — suddenly it is about how to get information to the highest number of locations as soon as it comes out.
This is especially true with all of the blogs that publish content regularly and feed hundreds of subscribers with data about data, or "metadata." It means that a user could arrive at any page on your site, because bloggers don't link to your home page, but rather to the content they're writing about. Still, while blogs are important because of search engines, the bookmarking sites actually take the content to where the eyeballs are.
Google uses link structure as an important factor in ranking pages. Bloggers are ubiquitous (and also very fast to update for new content) in linking to sites, which makes blogs one of the most crawled sectors of the web. Bloggers, however, seem to respect only content (or killer apps). Some sites I consider to be purely "blogs": they output RSS from various sites and also provide reviews and links back to the content.
Designing for blogging requires an RSS feed (taken for granted, really). Getting your users to link back to your site from their blogs is an excellent linking strategy (and will keep you from having to do it yourself).
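An RSS feed is just a small XML document listing your newest content. As a minimal sketch (the site name and URLs below are placeholders, not from any real site), a feed can be generated with nothing more than Python's standard library:

```python
import xml.etree.ElementTree as ET

def build_rss(site_title, site_link, items):
    """Build a minimal RSS 2.0 feed; items is a list of
    (title, link, description) tuples for recent content."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_link
    ET.SubElement(channel, "description").text = site_title + " feed"
    for title, link, description in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = link
        ET.SubElement(item, "description").text = description
    return ET.tostring(rss, encoding="unicode")

# Hypothetical site and article, for illustration only.
feed = build_rss("Example Site", "http://example.com/",
                 [("New article", "http://example.com/article-1",
                   "A one-line summary bloggers can quote.")])
```

A real feed would add elements such as `pubDate` and `guid` per item, but even this skeleton is enough for aggregators and bloggers to pick up your new pages.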
Blogging provides user-generated links back to your site. This is fast becoming a surefire way of getting a huge number of links, and thus improving your ranking in the search engines. You do not have to be a content provider to attract content (and hence attract bloggers), but you do have to channel it, by gathering content and bringing it to your site via RSS feeds. The search engines notice the activity on your site (it gets new content every time your content providers get new content), and activity always improves rankings. Your users also notice that they get more relevant resources on your site; hence they blog about it and output your own RSS on their sites. For businesses that are not content publishers, this is a good way of getting users to actually "advertise" for you via blogging on their own sites, and also to improve your linking structure.
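The channeling side of this works the other way around: you consume other sites' RSS and surface it on your own pages. A minimal sketch of pulling headlines out of a feed (the sample feed below is invented; in practice you would fetch the XML from a partner's feed URL over HTTP):

```python
import xml.etree.ElementTree as ET

# Invented sample feed, standing in for XML fetched from a partner site.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Partner Blog</title>
  <item><title>Post A</title><link>http://partner.example/a</link></item>
  <item><title>Post B</title><link>http://partner.example/b</link></item>
</channel></rss>"""

def extract_items(feed_xml):
    """Return (title, link) pairs for every item in an RSS 2.0 document."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

items = extract_items(SAMPLE_FEED)
```

Rendering those `(title, link)` pairs as a "latest from around the web" box is what gives a non-publisher site the constantly refreshing content the search engines reward.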
Websites used to be all about structure and styling, using HTML for table structures and CSS for styling (and later structure, with DIV tags). Now the most important thing about websites is actually how the content is arranged. This has been shown by how important search engines have become, and how websites have started designing around search engines (that is what SEO is all about, after all). Now websites actually have to be designed with the understanding that some users will also distribute their content for them.
Tagging, unlike blogging, actually increases hits, as the bookmarking sites are slowly becoming quickly changing databases. Apart from that, tags can overlap depending on the user. It is no longer a bot that reads your keywords and categorizes your pages in a database; now users read your content and tag it as they like, meaning different users can tag this article differently. For example, one might tag it "Internet marketing," another "design," and for all I know, some may tag it "balderdash."
Social bookmarking brings your web site URL into communities, and can actually generate a large number of hits straight from the bookmarking site. For example, there was a URL on Digg that got 1658 visits in 16 hours; that is a lot of hits in 16 hours!
You can encourage tagging by "inviting" your users to tag the site. Take a look at the SEO Chat page and you'll see that it links to Digg, Simpy, Furl and del.icio.us with a specific call to action. Note that not all of your users will be engaged in advertising for you; not all will add your link to their blogs, even if they think your content is compelling (most will read it and move on). The percentage that actually tags your site may be very low (think less than one percent), but once it gets into that community, the eyes have it.
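Those calls to action are just links to each service's submission page with your URL and title passed as query parameters. As a sketch (the exact endpoint paths below are illustrative assumptions and should be checked against each service's own documentation), the links can be generated like this:

```python
from urllib.parse import urlencode

def bookmark_links(url, title):
    """Build 'tag this page' links for social bookmarking services.
    The endpoint paths are assumptions for illustration, not verified APIs."""
    query = urlencode({"url": url, "title": title})
    return {
        "del.icio.us": "http://del.icio.us/post?" + query,
        "Digg": "http://digg.com/submit?" + query,
    }

# Hypothetical page, for illustration only.
links = bookmark_links("http://example.com/article", "Designing for Web 2.0")
```

Dropping these links at the foot of each article costs nothing and lowers the friction for that sub-one-percent of readers who will actually tag you.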
Semantic Markup, Dynamic Sites
New web standards are fast approaching: the World Wide Web Consortium (W3C) aims to standardize websites on XHTML, but even this will not be adequate to describe content accurately. XML formats such as RSS help in syndicating (distributing) content, but for content to be described more adequately, languages such as OWL and RDF are harbingers of an approaching change in the way future websites will be built and described to search engines and other automated database compilers.
Designers will have to become more like programmers (apart from also becoming better Internet marketers) in order to not only adequately describe data, but also to track user behavior and to move constantly changing data around. As one analyst over at Lagos-based software development company Blue Interface remarked, "Brochure-based sites may as well be dead."
Web services allow anyone to build an interface to content on any domain, so long as the developers on that domain provide a web services API (Application Programming Interface).
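In practice that means composing an HTTP request against a documented endpoint and parsing the structured response. The sketch below is entirely hypothetical: the endpoint, parameters, and JSON response shape are invented to show the pattern, not taken from any real provider's API.

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint, standing in for a real provider's documented API.
API_ENDPOINT = "http://api.example.com/search"

def build_request_url(query, api_key):
    """Compose the GET request URL a client would send to the (assumed) API."""
    return API_ENDPOINT + "?" + urlencode({"q": query, "key": api_key})

def parse_response(body):
    """Pull (title, url) pairs out of an assumed JSON response shape."""
    return [(r["title"], r["url"]) for r in json.loads(body)["results"]]

url = build_request_url("web 2.0", "demo-key")

# Canned response used in place of an actual HTTP round trip.
sample = '{"results": [{"title": "Hello", "url": "http://example.com/1"}]}'
results = parse_response(sample)
```

Whatever the provider's actual format (many APIs of this era returned XML rather than JSON), the division of labor is the same: the domain owner publishes the endpoint and response schema, and anyone can build an interface on top of it.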
In the early days, simply getting people to post about you on forums guaranteed an online following. Forums and message boards have since grown more numerous, but this was mostly for web sites. What do companies such as Federal Express or Toyota do to let users contribute to their content, or to jump on the web 2.0 bandwagon?
For companies which are predominantly offline and simply use the Internet as a marketing tool, an excellent model for customer engagement is GM's Saturn, which made a pretty boring compact car a hit across age ranges and demographics simply by letting users get together and rave about the product (this was predominantly offline). BMW has also started generating a lot of buzz online through the publicity generated offline by building "controversial"-looking cars. Now there are whole sites dedicated to BMW bashing, where owners and aficionados go to air their opinions.
Movies, music, and other product sites allow their users to review their products. By avoiding self-serving spin and by not censoring bad reviews, companies that simply use the Internet as a promotional avenue (record labels, book publishers) have allowed online cyberbuzz to connect with offline buzz. An excellent illustration is the controversy surrounding the choice of Daniel Craig as "James Bond" in Casino Royale: a lot of "negative" cyberbuzz was generated, yet it actually increased the attention given to the movie.