A Pre-History of Web Politics

A Personal View

Phillip Hallam-Baker

"[A] Trojan horse carrying enemy soldiers in its belly"
Ayatollah Shahroudi[1]

The Web was from the start a political project. The Web was not invented to change the world, but it was intended to change the way the world works. The idea that everyone has the right to be a publisher is built deep into the fabric of the Web.

There is a tendency among politicians and observers of the political scene to act as if only the events that happen in London or Moscow or Washington D.C. or another capital of a world power are 'real politics'. Technology recognizes no such boundaries: the Web disrupts power relationships within institutions in exactly the same way that it has disrupted power relationships in government and politics. The disruption is intentional and stems directly from the design philosophy of the Web.

The CERN Telephone Book

As the story is told by Sir Tim Berners-Lee, the history of the Web as a phenomenon begins with the story of the CERN telephone book. To reach critical mass within CERN, the Web needed a killer application to drive deployment. Access to the CERN telephone directory provided that killer application.

As is frequently the case, it is the unasked question that is the most interesting one: why was there an unmet demand for network access to the telephone book?

The answer to this question will be familiar to anyone who observed the slow rearguard action fought by the mainframe computing establishment against the invasion of minicomputers and microcomputers. Although CERN had a large infrastructure of minicomputers and personal workstations, CERN-VM was the maypole around which the computing establishment expected everyone to dance.

As with many mainframe installations, CERN-VM was for most of its users an unwelcome legacy of decades past, with an archaic operating system that had been heavily customized over the years by the CERN Computers and Networks division. Not only was CERN-VM an unpleasant learning experience; users knew that the skills they gained would not transfer to any other machine. Although the machine itself was new, its performance was unimpressive even before allowing for the fact that it was shared with several hundred other users. On physics analysis code the machine was rather slower than a workstation-class machine costing two orders of magnitude less.

Not surprisingly, many researchers at CERN avoided CERN-VM whenever possible. Others had spent so much time and effort learning how to use the illogical and decrepit CERN-VM that they saw no reason to learn anything new before their retirement. This view was particularly apparent amongst the management, who had ways to make avoiding CERN-VM very difficult indeed. In particular, CERN-VM was the sole repository for much of the information people needed to do their day-to-day work, such as the phone book, lists of meetings, physics data and so on. With the right knowledge it was possible to obtain some, but not all, of the information from other machines. But in the days before the Web, obtaining that type of knowledge was particularly difficult.

Another interestingly unasked question is why the first machine on the Web was a NeXTcube. Here the official story gives the cause but not the reason: after working on a CERN project to implement a Remote Procedure Call (RPC) protocol, Tim Berners-Lee was allowed to spend some time looking at the recently launched machine from NeXT Computer and evaluating its potential for physics analysis.

It is hard to imagine a sharper contrast between the NeXTSTEP user experience and CERN-VM. Like the Xerox Alto before it, the NeXT machine was so far ahead of its time as to be inevitably doomed to failure. It was as if DVD had been entered as a contender in the 1980s video format war between Betamax and VHS[2]. It took the computing industry more than a decade to catch up, with Mac OS X, Windows XP and the maturity of practical object-oriented programming environments such as Java and C#.

Tim realized that physicists at CERN should be using a machine that looked more like the NeXT machine than CERN-VM, and he became its apostle. More importantly, he understood that before physicists would consider it an acceptable replacement they had to be able to access the information stored on CERN-VM. Removing what had been a mere inconvenience for the physicists had become an urgent, time-critical need.

The Web broke the CERN-VM monopoly on data. Three years after the public launch of the Web, the machine was downsized and officially 'frozen'[3].

More importantly, the Web broke the monopoly using an architecture that was scalable, built on open standards, and that put practical needs before hypertext ideology. The Web did not guarantee that links would always work, and it ignored the obsession with copyright enforcement that had doomed the work of Ted Nelson and many others. The Web was 'scruffy', but it worked and it was available for free.

The Electronic Town Hall

My personal involvement in the Web began when I met Tim at the Computing in High Energy Physics conference in Annecy in September 1992[4]. Two days after returning I started what was, to the best of my knowledge, the first political site on the Web, providing materials from the candidates campaigning in the 1992 US Presidential election.

At the time the number of Web sites was about a hundred and the number of Web users more or less the same. The Web attracted publishers rather than readers for the simple reason that there really was not very much to read. Apart from the CERN telephone book and the specifications of the Web itself, content was sparse. It was possible to visit practically every site on the connected Web in an evening.

The idea of the Web as a medium for political action in the autumn of 1992 was a natural one for anyone who had been active in UK politics at the time.

Earlier that year the British general election had been won unexpectedly by John Major's Conservative party. By any political calculation it was an election that the party should have lost. After more than a decade in government the party had been forced to depose Margaret Thatcher as leader or face an electoral rout. Instead the Conservatives managed to scrape a narrow victory with the support of the Murdoch press, which proudly proclaimed 'It's The Sun Wot Won It!'

Few doubted the accuracy of Murdoch’s claim to have decided the outcome. The 1992 general election was one of the closest in UK history. For the next five years John Major would struggle to keep together a government that its own Chancellor would describe as ‘in office but not in power’. In 1997 Murdoch switched his allegiance to the opposition Labour party, helping to ensure that the Conservative party, the natural party of British government during the 20th Century, would fight three successive general elections with no realistic prospect of victory.

The role of the press barons in UK politics had long been controversial; they wielded what Tory Prime Minister Stanley Baldwin memorably described as "power without responsibility - the prerogative of the harlot throughout the ages."

The power of the press barons to set the political agenda and advance their own interests against those of the country could no longer be tolerated. The Web empowered every user of the Internet as a publisher. It was time to put it to work.

The Web could not hope to replace the established media but it could provide a feedback loop. As a control engineer I knew that the properties of any system are determined by its feedback loops.

The satirical magazine Private Eye demonstrated how a small independent paper could exert an influence out of all proportion to its readership by challenging the claims of the establishment and its press. In particular the Eye had for many years been one of the few voices to challenge the activities of the Labour press baron Robert Maxwell. The Eye's claims were vindicated after Maxwell's death, when it was discovered that he had perpetrated a massive fraud on his employees' pension funds.

At the time it did not seem possible that the Web could quickly replace the establishment media but it could provide a check on press abuse.

The Electronic Town Hall was not much by the standards of modern political Web sites but it did have the platforms of all the candidates on the ballot in the 1992 US general election. It had more than fifty visitors during the campaign, a considerable achievement given that at the start of the project the number of Web users numbered in the hundreds.

The 1992 US Online Election

Within a few days of meeting Tim, I sent an email to Jock Gill, then the manager of the 1992 Clinton-Gore online campaign, advising him that the Web represented the future of US political discourse and that the administration should have a Web site.

Technical support for the Clinton-Gore '92 online campaign came from the Political Participation Project at the MIT Artificial Intelligence Laboratory. The lab had for many years been working on using AI techniques to analyze reports of political events. Radical and racist groups were already using the Internet to organize[5], and it was clear that the Internet would play a key role in shaping mainstream political discourse. The best way to know how this would occur was to make it happen.

The Political Participation Project, run by John Mallery, Roger Hurwitz and Mark Bonchek, distributed press releases over USENET for the campaigns that accepted its services.

In 1992 discontent with the established media in the US was largely a concern of the right wing. In the wake of the Clinton impeachment and the 2000 election, the left became increasingly concerned as well. Ultimately, however, the debate over whether the media has a bias towards the left or the right takes place in a frame that benefits the established media, which asserts that if both sides are complaining equally it is probably doing the right thing.

The term disintermediation was already being applied to economic relationships as customers and suppliers eliminated the middleman. Mark Bonchek used disintermediation to describe the effect of the Internet on the information relationship between the campaigns and the voters. From now on the campaigns could talk directly to the voters without the intervention of what George W. Bush later described as the 'media filter'.

Whitehouse.gov

Putting the US government 'online' was a key commitment of the Clinton administration. The principal challenge was that at the time of the 1993 inauguration there were simply no precedents for what an online government could or should look like.

The administration decided that the White House could only join the Internet after every government department and agency was online. This decision had the important consequence that government agencies such as the CIA and NSA, which traditionally functioned in a clandestine manner, would be required to have a Web site.

Another consequence of the decision to put the Federal government on the Internet was that security became the gating factor. The problem was explained to me in the following terms. During the 1970s the principal target of a political coup was the television station: once the plotters had control of the means of official government communication, they became the government. When the Web site becomes the official means of government communication, control of the Web site becomes a critical asset.

In retrospect it is perhaps not surprising that the CIA and the FBI would be the first government agencies to be embarrassed by having their Web sites hacked. Management of the sites was outsourced to protect the confidentiality of the agencies' information assets; what the security analysts overlooked was the risk of a reputation attack and the need to protect the integrity of those assets.

The Way Forward

Judged by the original objectives of the Web, the only question that might be asked about the emergence of the blogosphere is, 'why did it take so long?' The ideas that emerged in the 2004 US election (disintermediation, Web fundraising, the netroots) were all anticipated in 1992 during the US election campaign. The Web had more than enough users to make it a potent force in the 2000 election cycle, if not in the previous election.

The reason for this delay should be obvious: politics and governance are social systems, not technological ones. The only reason a twelve-year delay is not considered a rapid pace of adoption is that developments in the Web are judged by the nonsensical yardstick of Internet time.

The Web has already had a lasting effect on the US political system. There can be no doubt that 2008 will be the Web election as 1960 was the television election. At least one frontrunner in the election race will be felled by an embarrassing video on YouTube, or perhaps by some emergent Internet phenomenon, such as Twitter, that even the most Web-savvy have not yet heard of.

Yet measuring the political impact of the Web by its effect on one of the oldest, most stable liberal democracies is to vastly underestimate its potential. The Web is the printing press of the digital age and Sir Tim Berners-Lee its Gutenberg. The printing press enabled the political processes that ended theocracy in Europe and led to the Renaissance and the rise of science; the despots and tyrants of the modern era understand the threat posed by the Web well enough.

As the Web becomes the world economy, total censorship of the Web amounts to a self-imposed economic blockade that is certain to harm the interests of the political elites on which a regime depends for support. Yet allowing access at any level poses an even greater long-term threat, as opponents are equipped with the tools to organize. In this instance perimeter security is certainly no defense: opposition from within is ten times the threat of an exiled leader without.

If the Web were merely a destructive force, upending the established order without replacing it, Ayatollah Shahroudi's description quoted at the beginning would be more than justified. It is our duty to ensure that the Web creates as well as destroys. In particular, it must be the mission of the Web to build the accountable infrastructure and accountable institutions that make democratic government possible. The fuse is already lit and change is now inevitable; the question is whether it will be change for good or ill, and it is the citizens of the Web who must make that choice.



[1] Nasrin Alavi, 'Persian chat', Financial Times, 5 November 2005, http://search.ft.com/ftArticle?page=5&queryText=Dion&id=051105001037

[2] The comprehensiveness of the defeat of Betamax may be judged from the fact that, only twenty years later, the word 'Betamax' does not merit inclusion in the Word spelling dictionary.

[3] Harry R. Renshall, CERNVM Status and Plans, http://cnlart.web.cern.ch/cnlart/216/node2.html

[4] Tim Berners-Lee, Robert Cailliau, World Wide Web, http://www.w3.org/Conferences/CHEP92/chep92www.ps.

[5] For example see Mark S. Bonchek, Grassroots in Cyberspace: Using Computer Networks to Facilitate Political Participation, http://www.organizenow.net/techtips/bonchek-grassroots.html