Infrequently Noted

Alex Russell on browsers, standards, and the process of progress.

Why JavaScript?

One strain of objection I often hear about the project of making the web more extensible is that it implies travelling further down the JavaScript rabbit hole. The arguments often include:

- that the web should admit languages other than JavaScript as first-class citizens;
- that better languages exist, and agreeing on a shared bytecode would let them all get along; and
- that JS doesn't fully describe everything in the web platform anyway, so why double down on it?

These, incidentally, are mirrors to the fears that many have about the web becoming "too reliant" on JavaScript. But that's a topic for another post.

Let's examine these in turn.

The question of what languages a platform admits as first-class isn't about the languages -- not really, anyway. It's about the conventions of the lowest observable level of abstraction. We have many languages today that cooperate at runtime on "classical" platforms (Windows/Linux/OSX) and the JVM because they collaborate on low-level machine operations. In the C-ish OSes, that's about moving words around memory and using particular calling conventions for structuring inputs and outputs to kernel API thunks. Above that it's all convention; see COM. Similarly, JVM languages interop at the level of JVM bytecode.

The operational semantics of these platforms are incredibly low level. The flagship languages and most of the runtime behavior of programs are built up from these very low-level contracts. Where interop happens at an API level, it's usually about a large-ish standard library which obeys most of the same calling conventions (even if its implementation is radically different).

The web has it the other way around. It achieved broad compatibility by starting the bidding at extremely high-level semantics which, initially, had very little in the way of a contract beyond bugwards compatibility with whatever Netscape or MSFT shipped last. The coarse, interpret-it-as-you-go contract of HTML is one of the things that has made it such a hardy survivor. JavaScript was added later, and while it has lower-level operational semantics than HTML or CSS, that history of bolting JS on after the fact has led to the current project of encouraging extensibility and layering; e.g., through Web Components. It's also why those who cargo-cult their experiences of other platforms onto the web find themselves adrift. There just isn't a shared lower level on which to interoperate.

That there aren't other languages interfacing with the web successfully today is, in part, the natural outcome of a lack of shared lower-level idioms on which those languages could build up their runtimes. It's no accident that CoffeeScript, TypeScript, and even Dart find themselves running mostly on top of JS VMs. There's no lower level in the platform to contemplate.
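To make that concrete, here's a sketch (not any particular compiler's actual output) of the shape of code compile-to-JS languages emit for a "class" in their source language: every construct has to bottom out in JS's own primitives, because there is nothing lower to target.

    // A hedged sketch of typical compile-to-JS output for a source-language
    // "class". Real compilers (CoffeeScript, TypeScript, etc.) differ in
    // detail, but all must express their semantics in terms of JS functions,
    // prototypes, and property lookup.
    var Point = (function () {
      function Point(x, y) {
        this.x = x;
        this.y = y;
      }
      Point.prototype.dist = function () {
        return Math.sqrt(this.x * this.x + this.y * this.y);
      };
      return Point;
    })();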

Which brings us to the second argument: there are other, better languages...surely we could just all agree on some bytecode format for the web that would allow everyone to get along...right?

This is possible, but implausible.

Implausibility is the only reason I pour time and effort into trying to improve JS and not something else. The Nash Equilibrium of the web gives rise to predictable plays: assuming that the incentives for adopting a low-level description of JS (as any such bytecode would have to describe JS as well as everything else) are not evenly distributed, movement by any group that is not all of the competitors stymies compatibility, which after all is the whole goal. Any language that wishes to interoperate with JavaScript and the existing DOM is best off describing its runtime in terms of JavaScript, because the threat that some vendor will simply never adopt a compatible bytecode is credible. Compatibility strategies that straddle the fence can work, but it's not a short (or clear) game to play. And introducing an abstraction that isn't fundamentally lower-level than JS (and/or does not fully subsume its semantics) is simply doomed. It would lack the power to even credibly hold out hope for a compatible future.

So, yes, there are better languages. Yes, you could put them in a browser. But unless you possess the power to put them in every browser, they don't matter unless their operational semantics are 1:1 with JavaScript.

You can see how I ended up on TC39. It's not that I think JS is great (it has well-documented flashes of genius, but so does any competitor worth mentioning) or even the perfect language for the web. But it is the *one language that every vendor is committed to shipping compatibly*. Evolving JS has the leverage to add/change the semantics of the platform in a way that no other strategy credibly can, IMO.

This leaves us with the last objection: JS doesn't fully describe everything in the web platform, so why not recant and switch horses before it's too late to turn back?

This misreads platforms vs. runtimes. All successful platforms have privileged APIs and behaviors. Successful, generative platforms merely reduce the surface area of this magic and ensure that privileged APIs "blend in" well -- no funky calling conventions, no alien semantics, etc. Truly great platforms leave developers thinking they're the only ship in the entire ocean and that it is a uniform depth the whole way across. It's hard to think of a description more at odds with the web platform. With the necessity and ubiquity of privileged APIs acknowledged, the right question becomes: what can be done about it?

I've made it my work for the past 3+ years -- along with a growing troupe of fellow thinkers -- to answer this charge by reducing the scope and necessity of magic in everyday web development. To describe how something high-level in the platform works in terms of JS isn't to deny some other language a fair shot or to stretch JS too far; it's simply to fill in the obvious gaps by asking the question "how are these bits connected?"
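As an illustration of what "filling in the gaps" looks like: a new tag whose behavior is plain, inspectable script, the way built-in elements could be explained. This is a minimal sketch using today's Custom Elements API (the proposals circulating when this was written spelled the registration call differently), and <x-panel> is a made-up example name.

    // A minimal, illustrative custom element: platform-looking behavior
    // described entirely in JS rather than hidden engine magic.
    class XPanel extends HTMLElement {
      connectedCallback() {
        // What would be privileged, invisible behavior for a built-in
        // element is here ordinary script anyone can read and extend.
        this.setAttribute('role', 'region');
      }
    }
    customElements.define('x-panel', XPanel);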

Those connections and that archeological dig are what are most likely to turn up the sort of extensible, layered, compatible web platform that shares core semantics across languages. You can imagine other ways of doing it, but I don't think you can get there from here. And the possible is all that matters.

That Old-Skool Smell, Part 2

The last post covered a few of the ways that the W3C isn't effective at facilitating the discussions that lead to new standards work and, more generally, how trying to participate feels as though you are being transported back to a slower, more mediated era.

Which brings up a couple of things I've noticed across the W3C that can likely be fixed more quickly. But some background first: due to W3C rules, it's hard to schedule meetings (usually conference calls) quickly. You often need two weeks' notice for a meeting to happen under a W3C-condoned WG, but canceling meetings is, as we all know, much easier. As a result, many groups set up weekly or bi-weekly meetings but, in practice, meet much less frequently. This lightens the burden for those participating heavily in one or two topics, but leaves occasional participants and those trying to engage from non-majority time zones at a serious disadvantage, because notice of a meeting's cancellation is near-universally handled via mailing list messages.

Yes, you read that right: the W3C uses mailing lists to manage meeting notices. In 2013. And there is no uniformity across groups.

Thanks to Peter Linss, the TAG is doing better: there's an iCal feed for all of our upcoming meetings that anyone can subscribe to. Yes, notices are still sent to the list, but you no longer need to dig through email to find out if the regularly-scheduled meeting is going to happen. Wonder of wonders, I can just look at my calendar...at least when it comes to the TAG.

That this is new says, to my mind, everything you need to know about how the current structure of the W3C's spending on technical infrastructure and staff has gone unchallenged for far, far too long. The TAG is likewise starting to make the move from CVS to Git...and once again it finds itself at the vanguard of organizational practice. That there has been no organization-wide attempt to get WGs to move to more productive tools is, to me, an indicator of how many in positions of authority (if not power) in the WGs and on the Staff think things are going. That this state of affairs isn't prima facie evidence of the need for urgent change and modernization says volumes. As usual, it's not about the tools but about the way the tools help the organization meet (or fail to meet) its goals. Right now, "better" looks like what nearly every member organization's software teams are already doing. Modernizing in this environment will be a relief, not a burden.

It's also sort of shocking to find that there are no dashboards. Anywhere. For anything -- at least not ones that I can find.

No progress or status dashboard to give the organization a sense for what's currently happening, no dashboard to show charter and publication milestones across groups, no visible indicators about which groups are highly active and which are fading away.

If the W3C has an optics problem -- and I submit that it does -- it's not doing itself any favors by burying the evidence of its overall trajectory in arcane mailing lists.

There is, at base, a question raised by this and many other aspects of W3C practice: how can the organization be seen to be a good steward of member time, attention, and resources when it does not seem to pay much mind to the state of the workshop? I'd be delighted to see W3C staff liaisons for WGs working, as their primary objective, to make products visible, easy to engage with, and efficient to contribute to. As it is, I don't sense that's their role. And that's just not great customer service. I hope I'm wrong, or I hope that changes.
