Over the next two days, I’m handing Suzemuse over to my friend Don. He has a really compelling take on the phenomenon of Web 3.0 and where all this new media stuff is headed. He starts us off today with a history lesson on how we got from Web 1.0 to where we are now. Enjoy the post, learn lots, and I encourage you to open a dialogue in the comments!

In 2003, I was asked to write a piece, called a “tech scan”, on the impact that then-current advances in technology would have on the justice sector. After spending several days reading ZDNet, CNET, and other reputable on-line sources, I settled on definitions for bleeding- and leading-edge technologies, linking them to Gartner’s “hype cycle.” Gartner is a multi-national consulting firm whose highly paid consultants monitor technology and produce highly regarded projections for CEOs (Chief Executive Officers), CIOs (Chief Information Officers), and CTOs (Chief Technology Officers) to follow. I then chose specific technologies to focus on. My choices, pervasive computing (aka ubiquitous computing), radio frequency identification (RFID), identification frameworks, fuel cells, and web services (system-to-system communications), agreed with Gartner’s predictions. I even put together a scenario for how these technologies could work together. When I finished, I concluded that determining technology trends and forecasting their effects the way I did was akin to standing against the tide. Sifting through mountains of disparate information to determine which technologies held promise was both labour- and time-intensive. At the end of the piece, I strongly recommended that my employer dedicate resources to regularly monitoring emerging technologies and to leveraging existing trend forecasts.

It is far easier to put together a tech scan on the impact of advances in web technologies, as the web has three iterations already defined, spanning past, present, and future. Web 1.0 begot what is presently Web 2.0, and Web 3.0 is emerging. Each iteration is not a version per se; the World Wide Web (www) has not been upgraded. To explain, let us start, somewhat counter-intuitively, with Web 2.0, because it is the only iteration with an accepted formal definition, coming from Tim O’Reilly. O’Reilly, however, should not be confused with the English computer scientist Tim Berners-Lee.

Berners-Lee is largely credited with inventing the web in 1989. During the ensuing years, leading up to March 10, 2000, a rash of Internet-based companies, commonly referred to as “dot-coms”, were founded. They saw their stock prices multiply simply by associating themselves with the growing fervour around the web. Investors concerned themselves with increasing market share, with no regard for profitability. Many dot-coms had neither discernible products nor services. Few had articulated business plans. Their goal: establish a recognized brand, claim market share even at a sustained net loss, and charge profitable rates for services afterward. The bubble burst, which led to the emergence of Web 2.0.

Pre-bubble, building web applications and designing websites were incomprehensible to the general public. Afterward, web application developers and web designers were treated with disdain. Those of us who left university with formal training in building software faced some hardship starting our careers post-bubble. However, the web continued to evolve. While the more resilient dot-coms struggled to re-invent themselves (e.g. Amazon and Yahoo), traditional publishers discovered a new digital medium in the web and found their way onto it, only to discover that non-traditional content was already present and thriving: the blogosphere. People had flocked to the web, using it to create new ways to communicate with one another and share ideas. The wasteland that was the web had again grown lush with opportunity.

In 2004, O’Reilly, CEO of O’Reilly Media, held a brainstorming session with MediaLive International, a trade show production company. The session’s intent was to address the “state” of the web, determine its possible future, and demonstrate that it could again generate revenue. During the session, O’Reilly Media publisher Dale Dougherty coined the phrase Web 2.0. A year later, O’Reilly published a blog post that formally defined Web 2.0 as having the following characteristics:

  1. The Web as a Platform: services are built on and delivered from the web (e.g. Google)
  2. Harnessing Collective Intelligence: the web both empowers user-generated content and leverages that content to foster communities and collaboration (e.g. blogs, wikis, and the RSS feeds they produce)
  3. Data is the Next “Intel Inside”: data drives the services on the web, but successful services depend on embracing and enriching that data with contributions from many sources (e.g. Google Maps)
  4. End of the Software Release Cycle: software is delivered as a service rather than shipped in versioned releases
  5. Lightweight Programming Models and Mash-ups: simpler solutions are built by leaner development teams, who can innovate by assembling existing services into new ones
  6. Software Above the Level of a Single Device: services are no longer limited to desktops or laptops and can leverage other devices, such as consumer electronics
  7. Rich User Experience: user interfaces offer more dynamic interactions but are based on open standards (e.g. CSS for layout, XML for data, XHTML for markup, JavaScript and the DOM for interaction)
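Point 5, the mash-up, is perhaps the easiest of these characteristics to show in code: take two services that each expose data, and join them into something neither offers alone. Here is a minimal JavaScript sketch of the idea. Both data sources are entirely hypothetical stand-ins; a real mash-up of the era might have joined, say, apartment listings from one site with map coordinates from Google Maps.

```javascript
// Source 1: listings from one (imaginary) service.
const listings = [
  { id: 1, title: "Downtown loft" },
  { id: 2, title: "Suburban duplex" },
];

// Source 2: coordinates from another (imaginary) service, keyed by listing id.
const coordinates = {
  1: { lat: 45.42, lng: -75.69 },
  2: { lat: 45.35, lng: -75.75 },
};

// The "mash-up": join the two sources into a combined view
// that neither service provided on its own.
function mashup(listings, coordinates) {
  return listings.map((listing) => ({
    ...listing,
    location: coordinates[listing.id] ?? null,
  }));
}

const mapped = mashup(listings, coordinates);
console.log(mapped);
```

In practice the two sources would be fetched over HTTP rather than declared inline, but the essence of the lightweight model is the same: a small team writes only the join, not either underlying service.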

Essentially, he defined a piece of jargon that encompassed all of the new business models and service approaches that formed post-bubble, labelled everything that came before as Web 1.0, and dismissed Web 1.0 as antiquated. By this retroactive framing, the 1.0 incarnation of the web began in 1990 and ended in 2000, and its 2.0 incarnation began in 2001 and continues today.

In 2006, Berners-Lee took issue with Web 2.0 in a podcast, saying that, as jargon, it overlooks the fact that Web 1.0 was equally about connecting people. The jargon is also open to interpretation. If Web 2.0 is, for a particular person, about blogs and wikis, then it is about people, which is precisely what the web was initially designed for: a collaborative space where people can interact. In other words, people had been putting words to the web long before the emergence of the chronological blog or the permalink. The difference is the ease with which content can now be published online and the increased ability to collaborate.

There are, however, signs that interest in Web 2.0 has begun to wane. In 2007, ZDNet’s Dan Farber blogged that Web 2.0 is not dead, but that Web 3.0 is “bubbling up.” The piece was a summary of the proceedings from that year’s Web 2.0 Summit. Highlights include the observation that Web 2.0 companies all seem to be at some stage of the same cycle: pursuing funding from venture capitalists, building market share, and then courting acquisition by larger corporations. Few big innovations have been produced. Instead, too many entrepreneurs have come to market with add-on features rather than products, too many products crowd the same space, and too many start-ups have tried to mimic an already successful product.

(come back tomorrow for part II!)

—-

Don is an IT professional with over a decade’s experience in a field where he has been called many things: application developer, database administrator, web-application architect, technical analyst, and security analyst. More recently, the word “senior” prefixes his titles, which he claims is to reduce white-space on his business card. Interestingly, he started life as a sociologist but turned to IT to pay the bills. In doing so, he became infatuated with information and with flattening the knowledge pyramid (data -> information -> knowledge). That infatuation persists today. You can follow Don @foodieprints on Twitter.