
Web 2.0 as Syndication

Roger Clarke **

Version of 24 September 2007

PrePrint of an article published in the Journal of Theoretical and Applied Electronic Commerce Research 3,2 (August 2008) 30-43, at http://www.jtaer.com/portada.php?agno=2008&numero=2#

© Xamax Consultancy Pty Ltd, 2006-07

Available under an AEShareNet Free for Education licence or a Creative Commons 'Some Rights Reserved' licence.

This document is at http://www.rogerclarke.com/EC/Web2C.html

The slide-set used to support a presentation of the paper is at http://www.rogerclarke.com/EC/Web2C.ppt (Warning: 5MB!)


Abstract

There is considerable excitement about the notion of 'Web 2.0', particularly among Internet businesspeople. In contrast, there is an almost complete lack of formal literature on the topic. It is important that movements with such energy and potential be subjected to critical attention, and that industry and social commentators have the opportunity to draw on the eCommerce research literature in formulating their views.

This paper assesses the available information about Web 2.0, with a view to stimulating further work that applies existing theories, proposes new ones, observes and measures phenomena, and tests the theories. The primary interpretation of the concept derives from marketers, but the complementary technical and communitarian perspectives are also considered. A common theme derived from the analysis is that of 'syndication' of content, advertising, storage, effort and identity.


Contents

1. Introduction
2. A Brief Retrospective on Marketer Behaviour on the Web
3. The Emergence of Web 2.0
4. Key Aspects of Web 2.0
4.1 Content Syndication
4.2 Advertising Syndication
4.3 Storage Syndication
4.4 Effort Syndication
5. Alternative Interpretations of Web 2.0
5.1 The Technical Perspective - AJAX
5.2 The Communitarian Perspective - The Architecture of Collaboration
6. Implications
6.1 Implications for Business
6.2 Implications for Research
7. Conclusions
References


1. Introduction

Since 1993-94, the Internet and the Web have provided opportunities for business-to-consumer marketers. Since 2004, a new round of innovation has been heralded, which is referred to using the term 'Web 2.0'. Yet, by mid-November 2006, ISI Web of Science listed only three articles on the topic. Even by June 2007, very few additional articles of consequence had appeared, and as yet none of them had any citations. Even Google Scholar revealed only a very limited formal literature on the topic. See, however, MacManus & Porter (2005) addressing web-site designers, Miller (2005) considering its applications to libraries, Millard & Ross (2006) comparing it with hypertext, and Schraefel et al. (2006) comparing it with the semantic web. As this paper was finalised for publication, Best (2006) came to light, and a book was published (Vossen & Hagemann 2007).

It is very challenging to impose rigour in circumstances in which the phenomena are unstable, potentials are unfolding progressively, and the most energetic among the actors have a vested interest in change rather than in retention of the status quo. Should researchers cede the field to commentators who are free to apply their insight and wisdom alone, without the strictures of theory, analysis and evidence?

The position adopted in this paper is that, if eCommerce researchers are to deliver results relevant to CIOs and management more generally, they must find ways to cope with highly dynamic contexts. New phenomena need to be subjected to examination, the propositions of pundits need to be tested, and theory needs to be built. For researchers to fulfil those responsibilities, some compromises to rigour must be accepted. On the other hand, it is vital that those compromises be identified and declared, and over-stated conclusions avoided.

The purpose of this paper is to interpret the Web 2.0 notion, seeking to answer the question of whether it is a mirage created from empty 'marketing-speak' or a tsunami of revolutionary change that will sweep over the Web and greatly alter the practice of eCommerce. It is addressed primarily to eCommerce researchers, but will also be of interest to practitioners and observers.

The paper commences by examining the propositions and prognostications of its originators and champions, placing them within the context of earlier Web applications. It then considers the phenomena from several perspectives. The primary assessment is from the perspective of the marketers and conference organisers who have generated the excitement surrounding Web 2.0. The elements fall into four related areas which share a common thread of 'syndication'. Rather different interpretations are evident when first the technical and then the communitarian perspective are adopted. Implications for business and research are drawn.


2. A Brief Retrospective on Marketer Behaviour on the Web

It is easy to forget that the World Wide Web was conceived in 1989-90 as infrastructure for information management and reticulation - "a common information space in which we communicate by sharing information" and more specifically "a global hypertext space ... in which any network-accessible information could be referred to by a single 'Universal Document Identifier'" (Berners-Lee 1998).

By late 1994, however, a workable transactional capability had been grafted on. The key elements were web-forms to enable user data to be captured, the HyperText Transfer Protocol (HTTP) POST method to enable data to be transmitted to the Web-server, and Common Gateway Interface (CGI) scripts to enable data to be interfaced with backend applications and databases (Clarke 2002). Supplementary technologies emerged very quickly, including means to maintain and manage the 'state' of successive interactions (such as cookies), server-side tools (such as automated customisation of HyperText Markup Language - HTML), and enhancements to client-side processing (such as JavaScript/ECMAScript, plug-ins and Java).
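To make the mechanics concrete, the following sketch shows the server side of that transaction pattern in contemporary form. It is illustrative only: a Node.js/TypeScript handler stands in for the CGI scripts of 1994, and the '/order' path and 'product' field-name are hypothetical.

    // A minimal sketch, not the 1994 mechanism itself: a handler standing in
    // for a CGI script, receiving web-form data sent with the HTTP POST
    // method and passing it to back-end logic.
    import { createServer } from "node:http";

    const server = createServer((req, res) => {
      if (req.method === "POST" && req.url === "/order") {
        let body = "";
        req.on("data", (chunk) => { body += chunk; });   // accumulate the posted form data
        req.on("end", () => {
          const fields = new URLSearchParams(body);      // parse name=value pairs from the web-form
          // A 1994-era CGI script would now interface these fields to a backend database.
          res.writeHead(200, { "Content-Type": "text/html" });
          res.end(`<p>Order received for: ${fields.get("product")}</p>`);
        });
      } else {
        res.writeHead(405);                              // only form POSTs are served here
        res.end();
      }
    });

    server.listen(8080);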

Revenue-based activity is generally considered to have arrived on the Web in October 1994, when HotWired launched an advertising-supported, content-rich web-site - although O'Reilly (2005a) claims that a prior instance existed as early as September 1993. Feverish activity ensued, as first advertisers and then marketers more generally sought ways to harness the new opportunities, ride the coat-tails of the rapid growth rates, and convert the hordes of hitherto idealistic community members into avid e-consumers.

The next 5 years saw a litany of disastrous attempts, chronicled in Clarke (1999). First, Schrage (1994) trumpeted the co-option of the Internet by the advertising industry: "There will be billboards along the Information Superhighway". A mere two years later, this was retracted by another Wired author who wrote of 'The Great Web Wipeout' (Bayers 1996).

Next, attempts were made to proprietise the Internet. The era of public bulletin-board services (BBS) had spawned proprietary on-line services, like The Source, Prodigy, CompuServe, GEnie and Apple eWorld. Successive attempts by Microsoft in 1995-96, variously to take over the Internet, to create a closed segment of the Internet, or to create a parallel service, all met dismal failure (Wolfe 1994, Caruso 1996). AOL did manage to sustain and, for some years at least, to grow its closed electronic community. Microsoft and others appear to be still hopeful that some service or other (currently Instant Messaging) will provide the means for them to privatise large segments of Internet business, while Google and eBay seek the same result variously through search-engine add-ons, content-hosting services and VoIP.

In 1997, the confident prediction was made that 'web-casting', represented by products such as PointCast, Marimba's Castanet and Netscape's NetCaster, would invert the HTTP protocol and push data to users (Kelly & Wolf 1997). This time it took a mere 12 months for Wired Magazine to see the light: "The most surprising roadblock for push ... comes not from technology, but from the intended audience. Push media's promises are often met with outright resentment from computer owners glad to be free of TV's broadcast model. ... Consumers are smarter than media organisations thought" (Boutin 1998).

At about the same time, Hagel & Armstrong (1997) advanced the thesis that the Internet was giving rise to a new model of consumer-supplier interaction, which they called 'the virtual community business model'. In particular, they perceived a shift in market power to 'info-mediaries', whose primary affinity needed to be to consumer-members. The engine underpinning this new order of things was to be information capture, to support profiling of both consumers and suppliers.

Then followed portals. These were designed to be the automatic first-port-of-call for large numbers of captive net-users eager to part with their money. Portals were expected by this round of soothsayers to command high advertising-rates and click-through fees, plus commissions on resulting sales (Nee 1998). Key services that such portals could offer were argued to be free email, plus personalised launch-pages, with automated page-change notification and news customisation (Thornberry 1998). Nine years later, some portals have survived, but the market-share achieved by most is far less than their sponsors had envisaged, and their business models are rather different as well.

By the late 1990s, a great deal of attention was being paid to the acquisition of data about individual consumers (Bayers 1998). In some cases, data capture was performed legitimately, through web-forms and 'the click-trail'. In many cases, however, it was achieved surreptitiously. Waves of new technologies have been devised and deployed, including pseudo-surveys, intrusive applications of cookies, applications of ActiveX, web-bugs, adware and spyware more generally.

There have been proposals that the marketing discipline and profession transform themselves to reflect the new realities of the Internet. These have included Pine's (1993) notions of 'mass micro-marketing' and 'mass customisation', Peppers & Rogers' (1993) concept of 'one to one' consumer marketing, Hoffman & Novak's (1997) 'new marketing paradigm', and Kelly's (1998) 'new rules'. Levine et al. (1999) argued that 'markets are conversations' and that consumers are getting smarter, have more information, are more organised, change quickly, and want to participate with corporations, but in return for their money they want the corporation's attention (Theses 1, 10, 30, 60, 62, 78, pp. xxii-xxviii).

But most corporations weren't listening (Thesis 14, p. xxiii). The leopards were having great difficulty changing their spots, because the perception of 'consumer as quarry' was too engrained. The task of the marketing visionary still appeared to be to make the new world of the Internet as much as possible like the old one of broadcast media. This was a key part of the context in which 'the dot.com bubble' burst in 1999-2000 (Wikipedia entry, accessed 16 November 2006. See also Cassidy 2002).


3. The Emergence of Web 2.0

In order to gain a clear understanding of the nature of the phenomenon, it is important to consider its origins. 'Web 2.0' is a brand-name created in an endeavour to revive the momentum lost when investors realised their errors and fled the 'dot.com' venture capital market in about 2000. The movement commenced in 2003 or 2004 in a brainstorming session (O'Reilly 2005b). The term's coinage was ascribed by Madden & Fox (2006) to Dale Dougherty. The movement revolves around a series of commercial conferences, and the key players are marketers and IT suppliers.

One player is the eloquent, visionary and exciting (but not infrequently wrong) Kevin Kelly, a long-time driver of Wired magazine (e.g. Kelly 2005). The movement has a clear ancestor in his 'New Rules for the New Economy' (Kelly 1998, rules summarised on p. 161). See also Levy & Stone (2006). This paper will argue that there are points of substance amongst the vague, motivational marketing hype. In order to gain an appreciation of current changes and where they might lead, there is a need to extract that substance.

A definition of 'Web 2.0' remains elusive (Yourdon 2006). There is a shortage of serious papers on the topic, with most contributions being loose trade-press articles or verbal presentations at conferences supported by slide-sets: bullet-lists as a substitute for reviewable analysis and argument. The movement does however have a manifesto, in the form of O'Reilly (2005b). According to O'Reilly, the original brain-storming session formulated "the sense of Web 2.0 by example", as shown in Exhibit 1.

Exhibit 1: Web 2.0 by Example - September 2005

Web 1.0                            Web 2.0
DoubleClick                 -->    Google AdSense
Ofoto                       -->    Flickr
Akamai                      -->    BitTorrent
mp3.com                     -->    Napster
Britannica Online           -->    Wikipedia
personal websites           -->    blogging
evite                       -->    upcoming.org and EVDB
domain name speculation     -->    search engine optimization
page views                  -->    cost per click
screen scraping             -->    web services
publishing                  -->    participation
content management systems  -->    wikis
directories (taxonomy)      -->    tagging ("folksonomy")
stickiness                  -->    syndication

Source: O'Reilly (2005b)

At a superficial level, Web 1.0 is asserted to reflect old, centralised patterns of business, with corporate-controlled resources serving remote consumer-clients. Web 2.0, on the other hand, has a much more open texture, with consumer-clients actively participating in the services, and shaping them. Several of the pairs are considered in detail below, in order to identify key elements and extract the common theme.

Central to the emergence of Web 2.0 has been a series of commercial conferences, driven by O'Reilly's company. They were initially dubbed the 'Web 2.0 Conference', but the title has since been elevated to 'Summit'. Events have been held in October 2004, October 2005, November 2006, April 2007 and November 2007. At the first conference, presenters identified the applications listed in Exhibit 2 as innovations that were leading the change.

Exhibit 2: Examples of Innovative Web 2.0 Applications - October 2004

Source: Canter & Fried (2004), slide 2

Insights offered in presentations at the first conference included that the Web is maturing into a platform for many further innovations, there is a vast amount of data that can be leveraged off, and businesses are leveraging user-generated content.

A particular claim is that what is referred to as 'the long tail' is generating a lot of business. This notion derives from Anderson (2004, 2006), which in turn drew on a research article (Brynjolfsson et al. 2003). It refers to business models, such as those of Amazon and Apple's iTunes business line, which evidence the familiar 80-20 pattern by drawing most of their revenue from a relatively small proportion of their product lines, but which also exploit the large total volume of products each of which attracts only a very small amount of business. Particularly in digital contexts, low-volume lines need not be unduly expensive to sustain.
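The arithmetic of the claim can be made concrete. The figures in the following sketch are invented purely for illustration, and are not drawn from Anderson or Brynjolfsson et al.:

    // Invented figures, for illustration only: a small 'head' of strong
    // sellers alongside a vast 'tail' of titles that each sell very little.
    const headTitles = 20_000;
    const headRevenuePerTitle = 5_000;     // dollars per year
    const tailTitles = 2_000_000;
    const tailRevenuePerTitle = 10;        // dollars per year

    const headRevenue = headTitles * headRevenuePerTitle;   // 100,000,000
    const tailRevenue = tailTitles * tailRevenuePerTitle;   //  20,000,000
    const tailShare = 100 * tailRevenue / (headRevenue + tailRevenue);
    console.log(`long-tail share of revenue: ${tailShare.toFixed(1)}%`);  // 16.7%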

In 2005, a keynote speaker asserted that the 'data points' in Exhibit 3 were important indicators of Web 2.0's context.

Exhibit 3: Web 2.0 'Data Points' - October 2005

Source: Meeker (2005), at slides 4-5

Meeker's slide 15 identified what she saw as an important additional driver:

The first few years of the movement have been characterised by excitement, lists of interesting things that people are doing, and vague new terms. The following section seeks to bring a greater degree of analytical care to the topic, and rather more scepticism.


4. Key Aspects of Web 2.0

The primary analytical contribution from within the Web 2.0 movement is O'Reilly (2005b), but see also Best (2006). O'Reilly identifies the following key distinguishing features of the new order: the web as platform; the harnessing of collective intelligence; data as 'the next Intel Inside'; the end of the software release cycle; lightweight programming models; software above the level of a single device; and rich user experiences.

This is an inadequate foundation for discipline-based studies. The study of new phenomena depends on a framework being developed that identifies and describes key elements and their inter-relationships, and enables an appreciation of their impacts, and their implications for various actors. Only then does the location and application of suitable bodies of theory become feasible. O'Reilly's depictions are accordingly considered in greater depth below.

Following an iterative examination of the Web 2.0 movement's own materials together with the available commentaries in professional and industry literatures, the common aspect that emerged is most usefully described as 'syndication'.

The term 'syndicate' originally referred to a loose, unincorporated affiliation of people or (later) companies, typically in the role of investors, along the lines of a consortium or joint venture. Later, 'syndication' came to be used to indicate a means of distribution of media material such as commentators' text, sports photographs and cartoons. The following sub-sections explain how the concept has been generalised, how it has been applied in another, related context, and how it can be usefully applied in several further contexts.

The key aspects of Web 2.0 from the marketing perspective are addressed in the following sub-sections: content syndication (4.1), advertising syndication (4.2), storage syndication (4.3) and effort syndication (4.4).


4.1 Content Syndication

A key departure-point for the Web 2.0 business movement is the vast amount of content now being made available open and gratis on the Web. The Web was devised as a means of publishing content. It has been extended to enable content-originators to not merely make content available to others, but also to make it easy for those interested in knowing about new content to become aware of it, and to receive it.

The term 'syndication' has recently come to be applied to arrangements under which a party that originates content licenses others to utilise it, in particular by re-publishing it or re-publishing information about it. The term 'content syndication' is usefully descriptive of the integration of content publishing, promotion and distribution into a single mechanism.

As noted above, the concept pre-dates the Internet, for content such as sports photographs and cartoons. An early electronic service was personalised e-newspapers, which emerged before the Internet, and were delivered by fax. These were generated by 'software agents' that scanned indexing services, applied keywords that reflected the person's interests, and consolidated the selected articles into a customised document. Such services are less evident now than they were in the late 1980s through to the mid-1990s.

As the Web exploded in popularity from 1993 onwards, the amount of content accessible using web-browsers increased. To improve the discoverability of needles in the increasingly large haystack, search-engines emerged from 1994 onwards, in particular WebCrawler and AltaVista.

A proportion of content-providers utilised the scope that electronic media provides to progressively update content. A complementary capability was developed whereby interested parties could request email notification each time a web-page changed (Bell 2002).

During 2003, blogs (an abbreviation of 'web-logs') achieved mass popularity (Lindahl & Blount 2003). It became apparent that HTML editors had represented an impediment for many people, and through the removal of that impediment even more content was unleashed. The volume of postings appeared to be levelling out at about 1.3 million postings per day around the beginning of 2006 (Sifry 2006). The quality of the content in the 'blogosphere' is on the whole very low, but commercial opportunities arise nonetheless. Because discovery and navigation are challenging, further mechanisms have emerged such as the bookmarking of blogs, cross-references between blogs (sometimes referred to as the 'Who I'm reading' feature), and restriction of searches to blogs only. The clumsiness of blog architecture suggests that the field is ripe for the next (r)evolution, which is likely to offer yet more commercial opportunities.

Beyond those fairly limited capabilities, content syndication is supported by the Atom and RSS standards. These make 'feeds' of recently-published content available in a standardised, machine-readable format. Users can subscribe to feeds, and employ software to periodically display new and updated articles.
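As an illustration, a feed-reader's polling step might look like the following TypeScript sketch, assuming a browser environment (fetch and DOMParser are available). The feed URL is hypothetical; any RSS 2.0 feed has the same item/title structure.

    // A minimal sketch of a feed-reader's polling step.
    async function fetchItemTitles(feedUrl: string): Promise<string[]> {
      const response = await fetch(feedUrl);
      const doc = new DOMParser().parseFromString(await response.text(), "application/xml");
      // RSS 2.0 lists each article as an <item> with a child <title>.
      return Array.from(doc.querySelectorAll("item > title"))
        .map((title) => title.textContent?.trim() ?? "");
    }

    // Subscription software might poll hourly and display whatever is new:
    setInterval(async () => {
      console.log(await fetchItemTitles("https://example.com/feed.rss"));
    }, 60 * 60 * 1000);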

A small percentage of Web content directly raises revenue to off-set the costs involved in creating and publishing it. A further small percentage is supported by cross-subsidy from revenue from complementary activities such as consultancy services. A great deal of Web-content, on the other hand, whether on personal, corporate or organisational web-sites, or in blogs, is of the nature of 'vanity press' publishing, with the costs absorbed by the originator. Content syndication is enabling this vast 'open content commons' to be easily promoted, discovered and distributed. Consistent with the 'long tail' dictum, the high aggregate level of accesses to sites that each attract relatively small traffic volumes creates opportunities to build businesses.


4.2 Advertising Syndication

During the last few years, a new advertising model has emerged as a means of exploiting the open content commons for business purposes. The implementation most familiar to Internet users is Google's, although the model was actually pioneered in 2001 by Overture, which was subsequently taken over and is now known as Yahoo! Search Marketing. This section briefly explains the mechanics of what is usefully described by the term 'advertising syndication'.

The native form of the Web is the delivery of static HTML. As noted earlier, this had been extended as early as 1994 to enable the HTML to be customised 'on the fly', using data in the page-owner's databases to improve service to the requestor (Clarke 2002). The new model goes much further, however.

The foundation is 'pay-per-click' advertising, an approach most familiar in the form of Google's AdWords. Under this scheme, advertisers use key-words to indicate what the ad is about. They subsequently pay not for the appearance of an ad, but only when someone clicks on it. This has the benefit that only effective ads are paid for: consumers who 'vote with their fingers' affirm that the ad displayed to them is relevant to their interests at the time. Pay-per-click appears to be supplanting pay-per-ad as the dominant approach used on the Web. Its effectiveness has also attracted a proportion of advertising budgets previously assigned to conventional advertising channels. Its impact may even be so great as to substantially alter the economics of for-profit publishing (Cringely 2005).
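The pricing logic can be illustrated with simple arithmetic; the figures below are invented for illustration only:

    // Invented figures: under pay-per-click the advertiser pays for clicks,
    // not for appearances, so cost tracks demonstrated consumer interest.
    const impressions = 100_000;       // times the ad is displayed
    const clickThroughRate = 0.02;     // 2% of viewers click on it
    const costPerClick = 0.50;         // dollars, typically set by keyword auction

    const clicks = impressions * clickThroughRate;    // 2,000 clicks
    const cost = clicks * costPerClick;               // $1,000
    const effectiveCPM = 1000 * cost / impressions;   // $10 per 1,000 displays
    console.log({ clicks, cost, effectiveCPM });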

Initially, pay-per-click was implemented on relatively small numbers of web-sites each of which attracts large numbers of visits. The most familiar operator in this area is DoubleClick, which applies the conventional advertising agency business model by renting out space on pages on large web-sites. The next step, in conformance with the 'long tail' dictum, has been to apply the 'syndication' concept, by attracting the owners of the vast numbers of much smaller and/or much-less-visited web-sites into 'affiliation' with advertisers. The approach is most familiar to Internet users in the form of Google's AdSense.

The mechanism involves several key elements. Page-owners register as 'affiliates' of the advertising intermediary, and embed a small block of the intermediary's code in each of their pages. When a page is served, that code causes the intermediary's server to analyse the page's content and deliver ads judged relevant to it, and the page-owner receives a share of the fee for each resulting click.

This represents a much more refined way to capture the advertiser's dollar. It enables the mining of data in order to achieve much better targeting of advertising, and the extension of advertisers' reach out to the wide array of micro-markets defined by the vast library of community-generated content.
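The contextual matching at the heart of such schemes can be caricatured in a few lines. The sketch below is deliberately naive; the operators' actual algorithms are proprietary and far more sophisticated.

    // A naive sketch of contextual ad-matching: each ad's advertiser-chosen
    // key-words are scored against the words of the hosting page, and the
    // best-scoring ad is served into the affiliate's page.
    interface Ad { text: string; keywords: string[]; }

    function selectAd(pageText: string, ads: Ad[]): Ad | undefined {
      const pageWords = new Set(pageText.toLowerCase().split(/\W+/));
      let best: Ad | undefined;
      let bestScore = 0;
      for (const ad of ads) {
        const score = ad.keywords.filter((k) => pageWords.has(k)).length;
        if (score > bestScore) { best = ad; bestScore = score; }
      }
      return best;
    }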

To the consumer marketer, this is sophisticated and attractive. From the viewpoint of Web users, on the other hand, advertising syndication has the following features:

This has led some commentators to anticipate that the Web 2.0 movement will lead to "a marked loss of privacy on the Internet" (Harris 2006). Further consideration is given to the implications of advertising syndication in section 5.2 below.


4.3 Storage Syndication

The previous sections have outlined the ways in which Web 2.0 sets out to exploit and 'monetise' publicly available content, and people's requests for access to it. The movement may also be in the process of leveraging Web-users' storage facilities.

Peer-to-peer (P2P) architectures are complementing longstanding client-server architecture. Judging by the large proportion of total Internet traffic attributed to P2P networks, P2P may even be in the process of replacing client-server as the dominant architecture. This has been most noticeable in the areas of entertainment content, where, since Napster in 1999, a great deal of material has been shared among consumers in defiance of copyright law.

A new phase has begun. The success of Apple iTunes, the limited success of litigation threats by the Recording Industry Association of America (RIAA) against P2P operators and particularly against consumers, and the ongoing use and maturation of P2P, have together forced content-owning corporations' hands. At long last, the feature-film industry began negotiating with BitTorrent in 2005, in order to apply the technology to legal, for-fee reticulation of video (BitTorrent 2006). And in 2006, the music industry finally got over its years of denial, and entered into an accommodation with major P2P providers Sharman and Altnet.

There are many potential applications for P2P beyond music and video entertainment. They include audio, image and video related to news and education, emergency management information, urgent virus signatures and security-related software patches (Clarke 2006). A further category of content that is already stored on P2P users' devices is the paid advertisements that fund the operations of the P2P networks. The advertisements are reticulated through the same highly distributed network as the content itself.

In short, P2P is becoming 'mainstreamed' and respectable. P2P does, after all, have some attractive features for content-providers. Storage syndication utilises consumer-customers' storage-devices and bandwidth, not only to avoid companies' server-farms and network-connections becoming congestion-points and single-points-of-failure, but also to transfer the costs from producers to consumers.
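The load-spreading effect can be sketched as follows. This is illustrative only: pieces are assigned to peers at random, whereas real protocols such as BitTorrent use far more sophisticated piece-selection strategies.

    // A minimal sketch of what storage syndication achieves: a file is cut
    // into pieces, and a consumer fetches the pieces from many peers rather
    // than from one central server.
    function planDownload(pieceCount: number, peers: string[]): Map<number, string> {
      const plan = new Map<number, string>();
      for (let piece = 0; piece < pieceCount; piece++) {
        plan.set(piece, peers[Math.floor(Math.random() * peers.length)]);
      }
      return plan;   // piece index -> the peer that will supply it
    }

    // e.g. a 1,000-piece file drawn from three consumers' machines:
    console.log(planDownload(1000, ["peerA", "peerB", "peerC"]));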

Advertising syndication and storage syndication are natural partners. As P2P extends into new markets, and as the P2P field becomes dominated by for-profit corporations eager to supplement fee-for-service with advertising revenue, the space on users' disks appears likely to be exploited by corporations, possibly non-consensually, but at best with only limited consumer understanding of what the software licensing terms authorise.


4.4 Effort Syndication

O'Reilly summarises his messages about the user involvement element using the term 'architecture of participation'. More upbeat expressions include 'harnessing collective intelligence', and 'the surging wisdom of crowds' (Madden & Fox 2006). This sub-section suggests that these can be perceived as another form of syndication, leveraging off the efforts of community-members.

The contributions of user involvement to the Web 2.0 cluster of ideas appear to hinge on a few recent phenomena. The first is self-publishing, discussed above under the topics of 'vanity press' and content syndication. This began with posts to Usenet news, fora and email-lists, migrated to web-pages, is currently focussed on blogging, and may well migrate again in the near future as the inadequacies of blog architectures are addressed.

A second phenomenon is collaborative publishing, whose dominant form is currently wikis. These enable multiple individuals to edit content after it has been created. Documents may be open for editing by anyone, or collaboration may be restricted to people who satisfy some qualifying conditions. The most celebrated individual site of this kind is Wikipedia. This is an encyclopaedia that was created and is maintained by volunteer authors, and is available gratis. In late 2006, there were over five million articles, in about 250 languages. The human behaviour exhibited in relation to the endeavour covers the full range from altruism and professionalism to mediocrity, banality, venality and nihilism (Denning et al. 2005 and, less confidently, Correa et al. 2006).

A third area of development is free-text metadata, particularly in the form of keywords, descriptors or meta-tags for photos and video-clips. This has been dubbed 'folksonomy', to distinguish it from disciplined taxonomies and defined meta-data vocabularies. A celebrated application is to assist in the discovery of photographs at flickr.com.

One of the innovations associated with flickr.com was the tag-cloud: a visual representation of the frequency with which other terms are associated with any particular term. It represents a statistical or 'brute force' means of discovering concept-clusters, which (in addition to its primary function of being exciting) can represent a valuable empirical supplement to more considered analytical approaches. Exhibit 4 contains a published example of a tag-cloud relevant to this paper, and a sketch of the underlying computation follows it.

Exhibit 4: Tag-Cloud for 'Web 2.0' - November 2005

Source: Budd (2005)
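A minimal sketch of the computation behind such a display: count how often each tag occurs, then scale counts into font sizes. The 10-32 pixel range is an arbitrary choice for illustration.

    // Tag-cloud weighting: frequent tags render in larger type.
    function tagCloudSizes(tags: string[], minPx = 10, maxPx = 32): Map<string, number> {
      const counts = new Map<string, number>();
      for (const tag of tags) counts.set(tag, (counts.get(tag) ?? 0) + 1);

      const maxCount = Math.max(...Array.from(counts.values()));
      const sizes = new Map<string, number>();
      for (const [tag, count] of Array.from(counts)) {
        sizes.set(tag, Math.round(minPx + (maxPx - minPx) * (count / maxCount)));
      }
      return sizes;   // tag -> font size in pixels
    }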

During 2005-07, there have been energetic endeavours to leverage off the personal data captured into social networking systems (SNS). People donate 'profiles' of themselves and others. These are then linked into social networks in various ways. SNS could be in the process of giving rise to what might be termed 'identity syndication', whereby each digital persona is defined by the combination of profile data captured by the person and others into one or more SNS, and the apparent web of contacts generated within them.

These features appear to be currently useful to business in three main ways:

The term 'effort syndication' is a reasonable descriptor for business taking advantage of communitarian activities, and diverting them into the for-profit space or otherwise 'monetising' them.

This is not the first occasion on which such proposals have been made. Although there seems to be little evidence of Hagel & Armstrong's 'info-mediary' theory bearing fruit, the Web 2.0 movement is trying again, in a different way. There are doubts about the appropriateness of O'Reilly's use of the term 'architecture of participation', however, because this conception of Web 2.0, rather than fostering participation, leverages off it. The expression 'architecture of exploitation' would appear more apposite, because it carries with it both a positive business tone and a negative sense in a social context.


5. Alternative Interpretations of Web 2.0

The previous sections have investigated the original and mainstream usage of the term 'Web 2.0'. Two other flavours can be identified, one associated with information technologies, and the other with communities. They are considered below, partly as tests of the marketing interpretation, partly as critiques of it, and partly as counter-proposals, competitors and even attempts to subvert or divert the energy that corporations have invested.


5.1 The Technical Perspective - AJAX

Although marketers and advertisers have colonised the Web, they remain dependent on information technologies, and on people with the requisite technical expertise and skills, and the capacity to innovate. The Web began as a document retrieval tool, and at heart it remains just that. Technologists continue their efforts to adapt the supporting infrastructure to service the needs of marketers. A cluster of technologies that is collectively referred to as AJAX has been closely linked with the emergence of Web 2.0. This section provides a brief overview of AJAX.

In O'Reilly (2005b), it is proposed that the nimbleness necessary in Web 2.0 applications depends on frequent, small and quick enhancements and extensions. This replaces the long, slow release cycles of large packages. To long-time participants in the IT industry and the IS profession and discipline, this represents a welcome but much-belated reversal of the massive monolithisation of software such as ERP packages and Microsoft Office, and the re-discovery of the general systems theory principles of modularisation and de-coupling.

Various commentators perceive 'mixing' and 'mashups' to be the archetypal category of Web 2.0 application. A mash-up involves the more or less ad hoc combination of content from multiple sources, such as maps overlaid with descriptive data about the locations and facilities shown on them (Markoff 2005). To enable the integration of diverse applications, modules that have been conceived and developed independently of one another need to have well-defined data interfaces and application programming interfaces (APIs).

O'Reilly talks of 'lightweight programming models', in substitution for the over-blown stacks of protocols that the Web Services movement has been delivering. The key example of such a lightweight model is the approach referred to as AJAX, which is shorthand for 'Asynchronous JavaScript and XML'. The term is of recent origin (Garrett 2005), but describes a pattern that has been emergent for some years and represents a further improvement on longstanding techniques collectively referred to as Dynamic HTML.

The AJAX approach utilises well-established tools such as HTML, Cascading Style Sheets (CSS), eXtensible Markup Language (XML) and the JavaScript/ECMAScript family of client-side languages. Best (2006) provides a useful summary. The key difference is the use of the XMLHttpRequest object that contemporary browsers expose to client-side scripts. This supports data retrieval from the server 'asynchronously', i.e. without forcing a refresh of the entire browser-window. The technique enables closer control by the programmer of the user's visual experience, because small parts of the display can be changed, without the jolt of an intervening blank window. It is argued that this enables quicker response and improved usability (although that is subject to debate).
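The pattern can be illustrated in a few lines of TypeScript. The sketch below replaces a single region of the page with data fetched asynchronously; the URL and element identifier are hypothetical.

    // A minimal sketch of asynchronous retrieval: one region of the page is
    // repainted with data fetched in the background, so the rest of the
    // browser-window never refreshes.
    function refreshPanel(url: string, panelId: string): void {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", url, true);                 // true => asynchronous
      xhr.onreadystatechange = () => {
        if (xhr.readyState === 4 && xhr.status === 200) {
          const panel = document.getElementById(panelId);
          if (panel) panel.innerHTML = xhr.responseText;   // repaint one region only
        }
      };
      xhr.send();
    }

    refreshPanel("/stock-levels", "inventory-panel");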

This is achieved by constructing an 'Ajax engine' within the browser, which intercepts requests and responses and processes them on the client-side. An important motivation for Ajax developers is to reduce the complexity caused by proprietary features in Microsoft's Internet Explorer, such that a single application can work consistently on all client-platforms - as the Web was originally envisaged to do. (The revival of inter-operability may prove to be ephemeral, however, as Microsoft finds new ways to close out such generic methods and force developers back to its proprietary tools).

Another limiting factor is the insecurity inherent in such techniques. The corporation's applications are capable of being manipulated, subverted or hijacked, because a considerable amount of active code is visible client-side (e.g. Paul 2007).

From the user's perspective, however, control of the browser-window by code delivered by an application running on the server represents subversion of the concept of the Web and hijack of the functions of the browser. Marketers have repeatedly tried to bully the Web into a means of 'pushing' ads to consumers, despite the inherently 'pull' nature of the HTTP protocol. AJAX at last provides an environment in which the advertiser's dream of web-casting can be implemented. Perhaps 'billboards on the information superhighway' were trumpeted by Schrage (1994) a decade too early. And, now that they can be delivered, they come with a capacity for ad-targeting far greater than was feasible at that time.


5.2 The Communitarian Perspective - The Architecture of Collaboration

The mainstream commercial perspective described in sections 3 and 4 is facilitated by the technical perspective addressed in the preceding sub-section. Together they represent an architecture not of participation but of exploitation - exploitation of not-for-profit content, of the effort of content-creators, of the attention of accessors, and perhaps also of the processors in the Internet-attached devices of creators and accessors, of the storage-space on those devices, and of the consumer-participants' bandwidth.

The individuals whose efforts are being leveraged are in many cases only dimly aware that their behaviour is part of an economy. Instead, most perceive themselves to be participating in a community. As shown in Exhibit 5, many of the most celebrated instances of Web 2.0 applications are entirely dependent upon collaboration within communities.

Exhibit 5: Collaborative Resources Exploited by Web 2.0

There is an apparent conflict between the architecture of exploitation and cyberspace ethos - whose hallmarks include openness, participation, gratis service and freedoms (Clarke 2004). Those elements of cyberspace ethos have been formalised in recent years, with the emergence of the 'open content' movement, in particular Lessig (2001, 2004) and Creative Commons.

The Web 2.0 marketing movement may well be achieving efficiency in marketing communications (although that requires separate study). But, if so, the efficiency is being achieved through appropriation. Most consumers do not comprehend this; but some do, and it appears likely that more will. The sceptical consumer may perceive Web 2.0 as an extension of the web of abuse, deceit and manipulation that was previously practised directly by marketers, and that gave rise to the many false-starts documented in the opening section of this paper.

That web of marketing deceit now extends out, via advertising intermediaries, to many information-providers who were hitherto regarded as being trustworthy. As it becomes apparent that community-members are inflicting adware and even spyware on one another, thanks to Web 2.0 features, both the features and their proponents are likely to fall into disrepute.

Many for-profit corporations appear to be unready to acknowledge that the not-for-profit world has always been vast. They also seem to be unwilling to accept co-existence with that world, and instead remain as committed as ever to its exploitation, and unable to comprehend that social activities might be capable of resisting takeover by for-profit business enterprises. For example, Wikipedia, a collaborative reference repository, has been designed specifically to resist exploitation for profit. Yet the Web 2.0 movement would appear to regard Wikipedia and other such sources of content as merely being 'not yet monetised'. The 'consumer as quarry' marketer-ethos remains in place, and is now being augmented by 'content-originator as unintending donor'.

The 'architecture of participation' is not being interpreted along the human lines of Peppers & Rogers' 'one to one' consumer marketing, or even Kelly's (1998) 'new rules', let alone Levine et al.'s 'market as conversation'. The use within the Web 2.0 movement of terms such as 'user involvement', 'collaborative intelligence' and 'participation' is readily interpreted as a smokescreen for the surreptitious acquisition and consolidation of consumer data, and the manipulation of consumer behaviour, in ways that authors like Packard (1957) and Larsen (1992) never dreamt of.

On the other hand, the venture capital shaken loose by the prospect of new and profitable businesses may transpire to be to the advantage of the communities that business is seeking to exploit. Ideas are being trialled, and experiments are being conducted. Many will fail, but the best of the ideas will survive. Moreover, much of the software will survive as well, because of the prevalence of open interface standards, and of open source licensing. The field is so dynamic, however, that reliable predictions are very difficult to formulate.

The communitarian perspective is therefore somewhat at variance with the marketing vision, in that it seeks to resist or subvert exploitation for profit. It nonetheless embraces cross-leveraging and collaboration of content, effort and infrastructure, in what might reasonably be termed 'social syndication'.


6. Implications

The preceding sections have identified three interpretations of Web 2.0, one mainstream and commercial, a second technical, and the third communitarian. There is substance amongst the over-excited hype. If any of the visions deliver on their promise, or some more mature movement emerges, the changes will have considerable significance for business, and hence for eCommerce researchers. This section considers implications for both of them.


6.1 Implications for Business

The Web 2.0 movement is diffuse, and does not permit simple definition. The common theme of syndication identified in this paper does, however, indicate that there is some degree of cohesion among the scatter of ideas.

Earlier sections of this paper identified many opportunities for business to leverage off other people's content, effort and infrastructure. Effective use by each individual company depends on a clear appreciation of that company's existing competitive advantages. Web 2.0 archetypes can then be matched against those features, in order to craft a strategy appropriate to the company's own strengths. Some organisations will find it valuable to take advantage of intra-organisational opportunities first. This enables learning to take place inexpensively and under close control. Cavazza (2007) draws attention in particular to the need to find out which Web 2.0 tools staff are already using and for what purposes, to encourage champions, and to initiate official trials.

In moving outside the organisation, care is needed. In the new context, both business partners and consumers have high expectations of interaction, responsiveness and ongoing adaptation. The prosumer notion, which appeared illusory for the first two decades after the term was coined by Toffler (1980), is finally emergent. In addition to being more demanding, however, prosumers are likely to be more sceptical, and accepting of exploitation only where they perceive that the exploiter provides service and value-add in return.

It is not entirely clear to what extent Web 2.0 technologies really represent progress beyond Web 1.0; they are both new and diverse, and hence the directions that they will take are not readily predictable. There is also intense competition among hype-merchants for brand-dominance, with literatures and conference-series now also selling 'Web 3.0' (inter-twined with so-called 'semantic web' notions) and even 'Web X.0'. From the outset, sceptics have drawn attention to the risk of implosion, and have parodied the movement as 'neo-dot.com' and 'Bubble 2.0' (e.g. Dvorak 2007). In short, there are not only opportunities for business enterprises, but also substantial risks that need to be managed.


6.2 Implications for Research

The significance of Web 2.0 for eCommerce researchers needs to be considered at two levels. Although it contains at least as large a proportion of 'marketing speak' and vacuity as other movements before it, the Web 2.0 movement draws attention to changes of substance that observers cannot afford to ignore. There are considerable challenges involved in finding appropriate theories to use as lenses, in deciding what to measure, and in finding and measuring it. Despite the probably ephemeral nature of the phenomena being observed and measured, the discipline needs to confront those challenges. If it fails to do so, it abandons relevance in favour of rigour, and cedes the information-space to marketers, leaving commentators bereft of background research to draw upon, and forcing them to combat marketer insight and wisdom with their own. We need to draw on theory where it exists, and to propose theory where there is none. One likely source is strategic theory and its search for sustainable advantage, e.g. as summarised in Clarke (1994).

The challenge that Web 2.0 represents is one instance of a general problem. eCommerce academics need to be able to comment on any relevant movement in the IT sphere. The understandable desire for greater rigour as a means of enhancing our reputation with colleagues in adjacent disciplines has been pursued to the point at which our capacity to conduct relevant research is being seriously undermined. Reverting to intellectual looseness is highly undesirable; but a more appropriate balance has to be sought between rigour and relevance.

Another aspect of eCommerce research that is highlighted by consideration of Web 2.0 is the need for it to be relevant not only to business interests but also to society as a whole. This analysis has concluded that the currently mainstream interpretation of Web 2.0 is driven by the for-profit sector, seeks to appropriate and monetise not-for-profit activity, and has potentially negative implications for vast numbers of people, both as content-originators and as accessors. Whether that conclusion is justified is an issue of considerable importance. It is essential that eCommerce researchers embrace the public policy perspective as a legitimate (if challenging) component of the research domain. Policy-relevant research is capable of being approached not only through the 'critical theory' avenue, but also using interpretivist and scientistic techniques.


7. Conclusions

This paper has summarised the Web 2.0 movement based on the present limited and mostly non-formal literature. The intention has been to provide a foundation on which theory-driven research can be developed.

Syndication has been identified as a common feature across all elements and all interpretations. Content, advertising, storage, effort and even identity are being appropriated, integrated, leveraged and monetised. The technical interpretation of Web 2.0 is broadly consistent with the marketing vision, but is oriented towards enabling it to be achieved. The communitarian perspective focusses on syndication, but emphasises social rather than economic aspects.

Web 2.0 is a malleable term, and may disappear as quickly as it emerged. The technologies supporting it might disappear quickly too, in which case it will prove to have been a passing fad - a mere mirage. In the meantime, 'Web 2.0' appears to be as good a generic descriptor as is available of the highly diverse new directions that Web activities have taken during the post-'dot.com implosion' period since the turn of the century. It may be that real change is occurring, and that a tsunami will roll over the top of the newly old Web 1.0.

Observers need to keep Web 2.0 and its constituent threads in view, and eCommerce researchers need to assist in understanding them. To do so, we must build on the framework provided in this paper, apply existing theories, propose new theories, and conduct structured observation of relevant phenomena in order to test, refine and expand those theories.


References

Unless otherwise stated, all URLs were accessed on 26 November 2006.

Anderson C. (2004) 'The Long Tail' Wired 12.10 (October 2004), at http://www.wired.com/wired/archive/12.10/tail.html

Anderson C. (2006) 'The Long Tail: Why the Future of Business is Selling Less of More' Hyperion, 2006

Bayers C. (1996) 'The Great Web Wipeout' Wired 4.04 (April 1996), at http://www.wired.com/wired/archive/4.04/wipeout_pr.html

Bayers C. (1998) 'The Promise of One to One (A Love Story)' Wired 6.05 (May 1998), at http://www.wired.com/wired/archive/6.05/one_to_one_pr.html

Bell S.J. (2002) 'Do I Detect a Change?' LibraryJournal.com, October 15, 2002, http://www.libraryjournal.com/index.asp?layout=article&articleid=CA251668

Berners-Lee T. (1998) 'The World Wide Web: A very short personal history' World Wide Web Consortium, 7 May 1998, at http://www.w3.org/People/Berners-Lee/ShortHistory, accessed 23 September 2007

Best D. (2006) 'Web 2.0: Next Big Thing or Next Big Internet Bubble?' Technische Universiteit Eindhoven, January 2006, at http://page.mi.fu-berlin.de/~best/uni/WIS/Web2.pdf, accessed 28 July 2007

BitTorrent (2006) 'Warner Bros. Home Entertainment Group Announces Revolutionary Deal to Publish Legal Film and TV Content using the BitTorrent Platform', BitTorrent Inc., 9 May 2006, at http://www.bittorrent.com/2006-05-09-Warner-Bros.html

Boutin P. (1998) 'Pushover?' Wired 6.03 (March 1998), at http://www.wired.com/wired/archive/6.03/updata.html?pg=1

Brynjolfsson E., Yu H.J. & Smith M.D. (2003) 'Consumer Surplus in the Digital Economy: Estimating the Value of Increased Product Variety at Online Booksellers' Management Science 49, 11 (November 2003)

Budd A. (2005) 'What is Web 2.0?' Clearleft Ltd, November 2005, at http://www.andybudd.com/dconstruct05/, accessed 23 September 2007

Canter M. & Fried J. (2004) 'Lightweight Business Models', slide-set, Proc. Conf. Web 2.0, October 2004, at http://conferences.oreillynet.com/presentations/web2con/canter_marc.ppt

Caruso D. (1996) 'Microsoft Morphs into a Media Company' Wired 4.06 (June 1996), at http://www.wired.com/wired/archive/4.06/microsoft_pr.html

Cassidy J. (2002) 'Dot.con: How America Lost its Mind and Its Money in the Internet Era' Penguin, 2002

Cavazza F. (2007) 'What is Enterprise 2.0?' FCnet, at http://www.fredcavazza.net/2007/07/27/what-is-enterprise-20/, accessed 23 September 2007

Clarke R. (1994) 'The Path of Development of Strategic Information Systems Theory' Xamax Consultancy Pty Ltd, July 1994, at http://www.rogerclarke.com/SOS/StratISTh.html

Clarke R. (1999) 'The Willingness of Net-Consumers to Pay: A Lack-of-Progress Report' Proc. 12th Int'l Bled Electronic Commerce Conf., Slovenia, June 7 - 9, 1999, at http://www.rogerclarke.com/EC/WillPay.html

Clarke R. (2002) 'The Birth of Web Commerce' Xamax Consultancy Pty Ltd, October 2002, at http://www.rogerclarke.com/II/WCBirth.html

Clarke R. (2004) 'Origins and Nature of the Internet in Australia' Xamax Consultancy Pty Ltd, January 2004, at http://www.rogerclarke.com/II/OzI04.html

Clarke R. (2006) 'P2P's Significance for eBusiness: Towards a Research Agenda' Journal of Theoretical and Applied Electronic Commerce Research 1, 3 (November 2006), at http://www.jtaer.com/portada.php?agno=2006&numero=3#

Correa P., Correa A. & Askanas M. (2006) 'Wikipedia: A Techno-Cult of Ignorance' Akronos Publishing, 2006, at http://www.aetherometry.com/antiwikipedia2/awp2_index.html

Cringely R.X. (2005) 'Stop the Presses!: How Pay-Per-Click Is Killing the Traditional Publishing Industry' Pulpit, 29 December 2005, at http://www.pbs.org/cringely/pulpit/2005/pulpit_20051229_000475.html

Denning P., Horning J., Parnas D. & Weinstein L. (2005) 'Wikipedia Risks' Inside Risks 186, Communications of the ACM 48, 12 (December 2005), at http://www.csl.sri.com/users/neumann/insiderisks05.html

Dvorak J.C. (2007) 'Bubble 2.0 Coming Soon' PC Magazine, 1 August 2007, at http://www.pcmag.com/article2/0%2C1895%2C2164136%2C00.asp, accessed 23 September 2007

Garrett J.J. (2005) 'Ajax: A New Approach to Web Applications' Adaptive Path, 18 February 2005, at http://www.adaptivepath.com/publications/essays/archives/000385.php

Hagel J. & Armstrong A.G. (1997) 'Net Gain : Expanding Markets Through Virtual Communities', Harvard Business School, March 1997

Harris W. (2006) 'Why Web 2.0 will end your privacy' bit-tech.net, 3 June 2006, at http://www.bit-tech.net/columns/2006/06/03/web_2_privacy/

Hoffman D.L. & Novak T.P. (1997) 'A New Marketing Paradigm for Electronic Commerce', The Information Society, Special Issue on Electronic Commerce, 13 (Jan-Mar.), 43-54

Kelly K. (1998) 'New Rules for the New Economy: 10 Radical Strategies for a Connected World' Penguin 1998, online version at http://www.kk.org/newrules/contents.php

Kelly K. (2005) 'We Are the Web' Wired 13.08 (August 2005), at http://www.wired.com/wired/archive/13.08/tech_pr.html

Kelly K. & Wolf T. (1997) 'Push! Kiss your browser goodbye: The radical future of media beyond the Web', Wired 5.03 (March 1997), at http://www.wired.com/wired/archive/5.03/ff_push_pr.html

Larsen E. (1992) 'The Naked Consumer: How Our Private Lives Become Public Commodities' Henry Holt and Company, New York, 1992

Lessig L. (2001) 'The future of ideas: The fate of the commons in a connected world' Random House, 2001

Lessig L. (2004) 'Free Culture: How Big Media Uses Technology and the Law to Lock Down Culture and Control Creativity' The Penguin Press, 2004

Levine R., Locke C., Searls D. & Weinberger D. (1999) 'The Cluetrain Manifesto: The End of Business As Usual' Perseus Books, 1999

Levy S. & Stone B. (2006) 'The New Wisdom of the Web' Newsweek, 3 April 2006, at http://www.msnbc.msn.com/id/12015774/site/newsweek/

Lindahl C. & Blount E. (2003) 'Weblogs: simplifying web publishing' Computer 36, 11 (November 2003) 114-116

MacManus R. & Porter J. (2005) 'Web 2.0 for Designers' Digital Web Magazine, May 4, 2005, at http://www.digital-web.com/articles/web_2_for_designers/

Madden M. & Fox S. (2006) 'Riding the Waves of 'Web 2.0'' Pew Internet Project, 5 October 2006, at http://www.pewinternet.org/pdfs/PIP_Web_2.0.pdf

Markoff J. (2005) 'Marrying Maps to Data for a New Web Service' The New York Times, July 18, 2005, at http://www.nytimes.com/2005/07/18/technology/18maps.html

Meeker M. (2005) 'Internet Trends' Proc. Conf. Web 2.0, October 2005, at http://www.web2con.com/presentations/web2con05/meeker_mary.pdf

Millard D.E. & Ross M. (2006) 'Web 2.0: hypertext by any other name?' Proc. 17th Conf. on Hypertext and Hypermedia , 2006, pp. 27-30

Miller P. (2005) 'Web 2.0: Building the Web Library' Ariadne 45 (October 2005), at http://www.ariadne.ac.uk/issue45/miller/

Nee E. (1998) 'Welcome to my store' Forbes, October 19, 1998, at http://www.forbes.com/forbes/98/1019/6209140a.htm

O'Reilly T. (2005a) 'Ten - No, Eleven - Years of Internet Advertising' O'Reilly Radar, 30 April 2005, at http://radar.oreilly.com/archives/2005/04/tenno_elevenyea.html

O'Reilly T. (2005b) 'What Is Web 2.0? Design Patterns and Business Models for the Next Generation of Software' O'Reilly 30 September 2005, at http://www.oreillynet.com/lpt/a/6228

Packard V. (1957) 'The Hidden Persuaders' Penguin, London, 1957

Paul R. (2007) 'Security experts warn about the risks of 'Premature Ajax-ulation'' Ars Technica, 2 August 2007, at http://arstechnica.com/news.ars/post/20070802-security-experts-warn-developers-about-the-risks-of-premature-ajax-ulation.html, accessed 23 September 2007

Peppers D. & Rogers M. (1993) 'The One to One Future : Building Relationships One Customer at a Time', Doubleday, 1993

Pine, B. (1993) 'Mass Customization', Harvard Business School, 1993

RFC2964 (2000) 'Use of HTTP State Management' Internet Engineering Task Force, October 2000, at http://www.ietf.org/rfc/rfc2964.txt

RFC2965 (2000) 'HTTP State Management Mechanism' Internet Engineering Task Force, October 2000, at http://www.ietf.org/rfc/rfc2965.txt

Schraefel M.C., Smith D.A., Russell A. & Wilson M.L. (2006) 'Semantic Web meets Web 2.0 (and vice versa): The Value of the Mundane for the Semantic Web' Department of Electronics and Computer Science, University of Southampton, May 2006, Preprint at http://eprints.ecs.soton.ac.uk/12665/01/iswc06-web2pointOmc.pdf

Schrage M. (1994) 'Is Advertising Dead?' Wired 2.02 (February 1994), at http://www.wired.com/wired/archive/2.02/advertising_pr.html

Sifry D. (2006) 'State of the Blogosphere - October 2006' Technorati, November 6, 2006, at http://technorati.com/weblog/2006/11/161.html

Thornberry K. (1998) 'Web search engines get personal: Start pages deliver info to Internet users', Business Courier, 19 October 1998, at http://www.amcity.com/cincinnati/stories/101998/focus7.html

Toffler A. (1980) 'The Third Wave' Collins, 1980

Vossen G. & Hagemann S. (2007) 'Unleashing Web 2.0: From Concepts to Creativity' Morgan Kaufmann, 2007

Wolfe G. (1994) 'The (Second Phase of the) Revolution Has Begun' Wired 2.10 (October 1994), at http://www.wired.com/wired/archive/2.10/mosaic.html

Yourdon E. (2006) 'Creating Business Value with Web 2.0' Cutter IT Journal 19, 10 (October 2006)


Acknowledgements

Versions of this paper were presented at the University of Koblenz in May 2007 and at the Australian National University in July 2007. The feedback provided by participants in those seminars, by members of the Link Institute, and by the referees and editor of JTAER, was instrumental in clarifying the analysis.


Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in the Cyberspace Law & Policy Centre at the University of N.S.W., a Visiting Professor in the E-Commerce Programme at the University of Hong Kong, and a Visiting Professor in the Department of Computer Science at the Australian National University.


