Infrequently Noted

Alex Russell on browsers, standards, and the process of progress.

The Strangely Charged ES4 "Debate"

Anyone with an RSS reader and a passing interest in languages or browser technology has by now caught a whiff of the strange odor coming from the ES4 vs. ES3(.1) debate. Before I dig in, I should mention by way of disclaimer that I'm an invited "expert" (har!) to the TG1 working group. Take the rest of this post with the appropriately sized dose of granulated sodium.

A lot of dirty laundry is getting aired and I'm afraid I haven't been immune from hyperbole and inaccuracies in the process, so charged and furious is the debate. There's been procedural wrangling, odd missives, emphatic rebuttals, painful comment threads and expositions on theories of how the web should evolve. Even a bit of popcorn sellin...er...press coverage.

We're starting to miss the plot a bit.

Wrapped up in the question "will Microsoft implement ES4?" is, implicitly, "what will Microsoft implement, and when?". Answers to this question aren't about features so much as they are about trust. Reflected in the comments on these threads are, it seems to me, two primary voices, roughly paraphrased as "whatever you do, don't break the web we already have" and "the web is stagnating; give us real new capabilities, and soon".

We've seen both of these voices before, particularly in regards to W3C activities. Some of my previous comments on that topic made the debate louder without (I'm afraid) providing much perspective about how the community can effectively advocate for fixes. What's good about the ES4 debate (in contrast w/ my CSS ramblings) is that it is scoped to particular implementers. Insofar as advocacy makes a difference, lobbying a moribund working group is nearly useless. They have authority but no power. Power in these situations always lies with implementers, aka browser vendors. The CSS discussions are weird and airy because there's no real skin in the game. Swatting at the W3C is just so much shadow boxing. Asking browser vendors to take risks, to show some good old-fashioned technical leadership, and to put some of their theories about how the web should evolve into their competing implementations... now that's got legs.

The first perspective, I think, is borne of a fear of loss mixed with a big pinch of "IE4 vs. NN4 sucked" (cue involuntary twitching). The web won. The web is strong. The web's strength is due in large part to the fact that (as never before) it "is" a single thing in the minds of developers. I'm incredibly sympathetic to this viewpoint, and the argument that major changes in HTML and other fundamental web technologies create openings for closed technologies to sell the "but we have a uniform deployment environment" line does hold some water. But not enough.

Closed-platform vendors have always been selling that advantage, in part because it may be their only sustainable advantage over mass ubiquity and real competition. What advocates of the "go slow" position rightly are pointing out is that when faced with what to do next, the pace of change suggests that the deployed browser population is set in stone. It has always looked this way, though. From the days when Netscape 3 kept us from using DHTML in any real form to the years when NN4 just couldn't die fast enough all the way through the new deadpool taking bets around the demise of IE 6, progress on the web has always been gated by deployed systems. Yet somehow we've got most of HTML 4, CSS 2, and ES3 in a semi-portable form. Clearly the web is robust enough to handle moderate doses of divergent implementations. Progress neither asks for permission nor arrives with a press release. It's made one browser upgrade or replacement at a time.

Both of the perspectives on where to go from here are veiled ways of asking the question "who can make it better for me?". It turns out that the Ajax toolkits matter (if they matter at all) because they can give us what we want without waiting for browser vendors to field implementations; for some value of "what we want". That might seem like the future, then. Just ask your friendly Ajax toolkit vendor for something you need and wait for it to be implemented. Would that it were so simple.

Ajax toolkits (like Dojo) represent "cheap" agreement. Instead of the browsers duking it out on features and subsequently unifying APIs at a standards body, toolkits can paper over differences to some extent and provide a unified front, but the cost of doing so is massive. Not only do they need to be shipped down the wire (a constraint that mangles their design every step of the way), they introduce their own bugs, weirdisms, and inefficiencies. In a functioning marketplace, the feature set of Ajax toolkits would be changing very quickly. As new versions of all the browsers shipped, the toolkits would keep bridging code only so long as the old browser versions they papered over were still in play. The existing features would be gone in a year or two, replaced by native APIs and tags, and new features would be added to help lessen the blow as the browser makers attempted to figure out who was going to "win" on a particular way of implementing this or that.

In any event, it should be obvious with even cursory introspection that the toolkits can only approach so much functionality at once (until they're baked in or preferentially cached by the browsers, that is), and that functionality can really only cover basic capabilities which are ubiquitous or so close to it that a bit of script can push them over the edge. We may always have the toolkits, but arguments that somehow they'll dig us out of this hole don't stand up to inspection. We toolkit authors are combing the beach of features from the various browsers, desperately trying to construct a full set of shiny shells of any type. Small irregularities we can polish out, but if one thing is square and the other is round, there's just no way we can make one into the other without spending more time, effort, and money than it's worth.
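To make "papering over" concrete, here's a minimal sketch of the kind of bridging code a toolkit ships to every single visitor just to present one event API across W3C browsers and older IE. The helper name, the element id, and the details are illustrative, not Dojo's actual API:

    // A toolkit-style event helper: hide the addEventListener vs. attachEvent
    // split behind one function. Every byte of this goes down the wire.
    function addListener(node, type, handler) {
      if (node.addEventListener) {
        // W3C DOM browsers (Firefox, Safari, Opera, ...)
        node.addEventListener(type, handler, false);
      } else if (node.attachEvent) {
        // Older IE: different method name, "on"-prefixed type, and a
        // different `this`/event object, which we normalize for callers.
        node.attachEvent("on" + type, function () {
          handler.call(node, window.event);
        });
      } else {
        // Last-ditch fallback for ancient browsers.
        node["on" + type] = handler;
      }
    }

    // Page authors write one call; the toolkit pays the cost of knowing
    // which browser is underneath.
    addListener(document.getElementById("save"), "click", function (evt) {
      /* application code */
    });

Multiply that by events, XHR, DOM traversal, CSS quirks, and animation, and the "massive cost" above starts to look like most of the download.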

So now we've got the shape of a solution to the problem: toolkits can act as our canny observers of ubiquity, constantly combing for ways to get us to a robust baseline, iterating faster than the entire ecosystem might otherwise allow. But we've been over this beach a lot already. We're out of shiny shells that look similar, and have been for a long while. At this point we're reduced to selling our own compositions instead of looking for the natural beauty which attracted us to this beach in the first place. We need a good storm. We need new features to mine, and we need to be able to drop all the code which we're monkey-patching old versions of IE with.
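As a hedged illustration of what dropping that code might look like (the query() helper and its details are hypothetical, not any particular toolkit's real API), a bridge like this prefers the native path when it exists; the script branch underneath it is exactly the monkey-patch we'd love to stop shipping once old IE versions leave the deployed population:

    // Prefer native CSS selector support; fall back to script otherwise.
    // The fallback branch is the "monkey-patch" we ship to every visitor.
    function query(selector, root) {
      root = root || document;
      var results = [];
      if (root.querySelectorAll) {
        // Native path: fast, and the only path worth keeping once the
        // deployed browser population catches up.
        var found = root.querySelectorAll(selector);
        for (var i = 0; i < found.length; i++) {
          results.push(found[i]);
        }
        return results;
      }
      // Script fallback: real toolkits carry a full selector engine here;
      // this sketch handles only the trivial tag-name case.
      var nodes = root.getElementsByTagName(selector);
      for (var j = 0; j < nodes.length; j++) {
        results.push(nodes[j]);
      }
      return results;
    }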

To be bleedingly obvious about it, we need the browser vendors to ship. Not once, but over and over again, on a predictable schedule, and with some actual competition and risk taking. Hiding behind W3C (or even ECMA) committees instead of shipping new stuff is a tactic which the web community at large simply won't accept any more.

New browsers (aka, new features) are the only way we get new shells on this beach. Sure, some of them are ugly. They'll get left behind and, in time, washed back out to sea. So be it. If we get shiny, pretty new ones tomorrow, the ugly ones don't matter. But we have to be able to trust that we will. The idea that the web should be the same tomorrow as it was yesterday is ludicrous. "Don't break the web" simply cannot be code for "if it changes, that means it's broken". It would be laughable on the face of it if it were. It may have been changing imperceptibly of late, but it has always been changing. Likewise, things which make it uninhabitable for large quantities of existing content can't be passed off as "good" just because they represent change.

What that all adds up to is that inciting browser makers to iterative action, spec or not, is the only way we actually get new stuff into the fabric of the web. If we're going to really ride this "open web" thing, we (the development community) need to come to grips with the fact that we're never going to get new features in one fell swoop. We've always gotten them in an N+2 fashion, and that's not going to change, no matter who's got the majority market share today.

What we should not ever settle for, ever again, is for any of the vendors we depend critically on to go radio silent. The web loses not when other platforms are better (they always have been, always will be), but when we lose confidence that the market mechanism of competition is functioning. When we can't count on new features at predictable mile markers, there's no change to treat as opportunity. It doesn't even matter so much how big the changes are. We just need to know that they're coming and will keep coming in the future. When there's no change (aka: opportunity), the web dies a slow, wrenching death as cannier things which can promise a future that's not the same as the present steal mind share, first at the high end and then all the way down the spectrum.

So, as I've been saying in my public talks of late, demand a public plan from the browser vendors. All of them. Asking for standards conformance isn't an answer, but combined with asking for a commitment to future versions it is the basis for resuscitating this market. We can start with: "When will your next beta or alpha be available? What about the version after that? Is your organization standardizing the new stuff you added in the last stable release? Where?"

Those questions don't speak to features; they speak to futures. Features without a future don't matter.