My first exposure to the world-wide web, just a little over two years ago, was on an obscure internet discussion forum -- alt.hypertext. Today, folks discover the web through Time, Newsweek, and the Wall Street Journal. And when I saw that Burlington Coat Factory had a storefront on the web, I realized that it's no longer just a cool "net.project" -- it's a way of doing business. It's becoming consumer technology.
That's what brings us here today -- the promise of a revolutionary new consumer technology. The web is indisputably the hottest technology trend today. But will it last?
Technology trends are like stars -- some never get past the vapor stage. Some grow too fast -- they go supernova and end up as white dwarfs -- niche markets -- or black holes -- a danger to anything near them. But my view is that the world-wide web will have a long, healthy life as a pervasive technology. The marriage of distributed hypermedia and the decentralized networking infrastructure of the Internet is evidently just what the times are calling for.
Obviously a lot of people are using the net and the web today. But a whole lot more are sitting on the side of the pool, watching the trade rags, testing the water, and trying to decide if and when to jump in.
In high-tech markets, the web is already cost-effective. Hewlett-Packard actually reduced support costs and increased customer satisfaction by delivering more information via the web and less by telephone.
Other web markets are not so mature today. But they're all growing. Various measurements of the size of the Internet and its markets may be all over the scale, but they all show the same trend of exponential growth. Smart business folks realize that even though the web may not be cost-effective today, the cost of playing catch-up tomorrow might kill them.
And clearly, there are large market segments where the producers and the consumers are sitting on opposite sides of a technology gap. They can't find each other in the vastness of the global information space. They can't exchange payments securely and reliably. The data formats limit the expressive capability of the information providers. And in this age of instant gratification, nobody wants to wait for information once they've found it.
This market demand for better web technology has not gone unnoticed. Enter Spyglass. Spry. Netscape. O'Reilly, EIT. And on their heels come IBM, Novell, Microsoft, Lotus, and MCI. Not to mention the legion of consultants, access providers, information providers, digital librarians and editors, and support organizations. And don't forget the internet software development community that brought you Mosaic, USENET News, Internet Relay Chat, and the other ubiquitous applications on the internet.
Believe it or not, that "free software" community is a stabilizing influence on this market frenzy: one thing that draws information providers to the web is the tremendous size of the audience. Depending on any technology that's not royalty-free severely limits the audience.
The result is that while these companies can add value to the web by offering stability, support, and custom applications, it would be self-destructive for them to "splinter off" by failing to interoperate with the mainstream web.
So how do vendors differentiate themselves? Where does innovation fit in? After all, growth of the market depends on confidence in the technology, which comes from a blend of the promise of an upgrade path and a proven track record of reliability.
This is the crucial role of interface specifications -- specifications of how various parts of the whole system operate. One way to look at these specifications is to say that they divide the world of all possible behaviours into mandatory, optional, and forbidden behaviours.
For instance, let's take the classic example of the interface between a driver and a car. A spec might say that a car must have a steering wheel, brakes, and an accelerator, and it would specify their location relative to the driver. The car may have a clutch and stick shift. The car may not have the driver's seat behind the passenger seat. And there are certain parameters that are open to individual interpretation, like the location of the headlight control. Moreover, some features of the driver/cockpit interface are completely independent of the basic operation of the car -- the stereo controls, for example.
The same is true of web software: some browsers support images. Some don't. Some servers support full-text searching. Some don't. And some offer a "hook" like the CGI interface where searching and other features can be added as an "after-market" option.
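A search hook of this kind can be sketched as a tiny CGI-style program: the server hands the program a query string, and the program writes an HTML response to standard output. This is only an illustrative sketch -- the document names and the "q" field name here are hypothetical, not part of the CGI spec.

```python
# Minimal sketch of an "after-market" search hook in the CGI style.
# The server passes the query via the QUERY_STRING environment variable;
# the program emits an HTML list of matching documents on stdout.
# The documents and the "q" field name are illustrative examples.

import os
from urllib.parse import parse_qs

DOCUMENTS = {
    "overview.html": "The web marries hypermedia and the Internet.",
    "process.html": "The IETF process builds consensus through review.",
}

def search(query_string):
    """Return the names of documents whose text contains the search term."""
    terms = parse_qs(query_string).get("q", [])
    if not terms:
        return []
    needle = terms[0].lower()
    return [name for name, text in DOCUMENTS.items()
            if needle in text.lower()]

if __name__ == "__main__":
    hits = search(os.environ.get("QUERY_STRING", ""))
    print("Content-Type: text/html\n")
    print("<ul>")
    for name in hits:
        print('<li><a href="%s">%s</a></li>' % (name, name))
    print("</ul>")
```

The point of the hook is exactly this separation: the server needs no knowledge of how searching works, and the search program needs no knowledge of how the server speaks HTTP.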
We are just now to the point where we have enough experience and shared understanding of the interaction between web software components to submit specifications to a formal standardization process. There are a few key characteristics of successful specifications that I'd like to discuss, and a few key properties of a standards process that has a proven record of producing them.
First, a specification must be complete to be successful. If significant aspects are left unspecified, then there is a possibility that independent projects or products will vary in their implementation of those aspects. And Murphy's law says that possibility should be considered a certainty. The result is that two implementations that adhere to everything in the spec do not interoperate. That's pretty much the definition of a specification failure.
On the other hand, you have to be careful not to overspecify an interface. It's a little bit annoying, sometimes, the way different cars have different ways to honk the horn. But if the location of the horn were limited to the traditional middle-of-the-steering-wheel position, where would we put a driver's-side air bag? And I personally think putting the stereo controls in the steering wheel is the best idea since the lightbulb. So we see that minimal specification is key to extensibility and growth.
The last characteristic I'd like to emphasize is modularity. If you can break a large, complex system into two or more smaller, simpler systems, that's the way to go. That way, you can replace one of them in the future without starting from scratch on the others. The HTTP protocol, the HTML data format, and the URL addressing scheme are modular parts of the web technology, for example.
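The URL addressing scheme is a good illustration of that modularity: an address can be taken apart with no knowledge of HTTP or HTML at all. As a sketch, using a modern library (the example address is illustrative):

```python
# The URL addressing scheme is a module of its own: parsing an address
# requires no knowledge of the HTTP protocol or the HTML data format.
from urllib.parse import urlsplit

parts = urlsplit("http://www.w3.org/hypertext/WWW/?q=specs")

# The scheme, host, path, and query are cleanly separable components.
assert parts.scheme == "http"
assert parts.netloc == "www.w3.org"
assert parts.path == "/hypertext/WWW/"
assert parts.query == "q=specs"
```

Because the pieces are separable, the scheme component can later name protocols that don't exist yet without touching the rest of the system.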
In fact, each of those aspects of the web technology is being standardized somewhat independently. You'll hear more about the HTML, HTTP, and URI working groups in Dave Raggett's presentation on the state of web standards. But I'd like to discuss the Internet Engineering Task Force and the IETF standards process, because it has a proven track record of creating specifications that work.
Standardizing specifications is really just the last step in the overall IETF technology deployment life cycle. First, an idea is proposed, perhaps to a working group chair and then to the group, if it seems appropriate. The proposal is batted around, reviewed, enhanced, or maybe trimmed down. Then the proposal is distributed as an internet draft, perhaps more than once due to review comments. But they don't write the result in stone just because they believe it looks finished. During this review process, members of the group are busy gaining real-world experience by implementing and testing the proposal. Once there are two independent implementations and the working group reaches consensus, the proposal is archived as an RFC -- a request for comments. If it stands the test of some more time, it may become an Internet Standard.
The keys to success in this process are an open process of consensus building, and implementation experience concurrent with standardization.
If you think that this seems like a long, tedious process for rolling out new technology, you're not far off. But remember: the target for this effort is lasting, shared technology.
If you want to deploy something new today, then you might be able to skip all that and get right to it. You just have to make sure that the feature you're after can be deployed in your application domain without causing interoperability problems with other domains. This can be a tricky task, given the volatile state of web specs today.
But for example, look at the Netscape extensions to HTML. Netscape is catching a certain amount of flak for not submitting a proposal for public review before deploying them. But I believe they made an honest effort to investigate and avoid interoperability problems. If you add, say, a <blink> tag to a document, it doesn't cause Mosaic or any other browser to behave strangely. So while the Netscape extensions violate the letter of the current HTML spec, they do not violate the spec in spirit.
As a counterexample, we can look back to the introduction of forms in HTML. Information providers that wanted to use forms had to include disclaimers like "look out! If you don't have Mosaic 2.0 or some other forms-capable browser, this page will look funky." There are mechanisms in the protocol that could have been used to let the software figure that out without manual intervention.
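The mechanism I have in mind is HTTP content negotiation: the browser declares what it can handle in its Accept header, and the server picks a representation accordingly. As a hedged sketch of the server side -- the "version=2.0" media-type parameter used here to signal a forms-capable client is illustrative, not a registered convention:

```python
# Sketch of server-side content negotiation via the HTTP Accept header.
# A browser lists the media types it understands; the server serves the
# forms version only to clients that claim to support it. The
# "version=2.0" parameter is an illustrative assumption, not a standard.

def choose_variant(accept_header):
    """Return 'forms' or 'plain' based on the client's Accept header."""
    for item in accept_header.split(","):
        media_type, _, params = item.strip().partition(";")
        if media_type.strip() == "text/html" and "version=2.0" in params:
            return "forms"
    return "plain"
```

With negotiation like this, an older browser simply gets the plain page -- no disclaimer needed, and no manual intervention by the user.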
It's one thing to add features to the system and encourage users to upgrade to software that supports them. It's quite another to carelessly deploy features that make the installed base of implementations look broken. That forces users to upgrade, and destroys confidence in the technology. Anyone looking at the web as a basis for mission-critical applications will be watching closely to be sure that enhancements are gracefully deployed. If they see the rules broken too many times, they'll just have to find some other way to get their job done.
So what will be the ultimate fate of the Netscape extensions? Will they become standard? I don't know. Some probably will, some probably won't, and some will likely be adopted in modified form. All that will be decided over time in the HTML working group. If you have an interest in seeing it go one way or another, that's the place to make your case. IETF working groups are open to all comers.
(I haven't keyed in the rest of my notes yet.)

Daniel W. Connolly