Infrequently Noted

Alex Russell on browsers, standards, and the process of progress.

Inadmissible Arguments

I spend a lot of time working in, on, and around web standards. As part of this work, I see the same bogus perspectives deployed over and over to defend preferred solutions. I hereby call bullshit on the following memetic constructs:

"That's just a browser caching problem."

Look for this to be deployed alongside standards-wonk classics such as "our job should just be to provide low-level primitives", "we don't know enough to solve this", and "people will just write tiny libraries to provide that".

"It's a caching problem" is a common refrain of those who don't build apps and aren't judged on their latency. And it's transparently bullshit in the web context. It relies on you forgetting -- for just that split second while it sounds possible to duck the work at hand -- that we're talking about a platform whose primary use-case is sending content across a narrow, high-latency pipe. If you work in web standards and don't acknowledge this constraint, you're a menace to others.

Recent history alone should be enough to invalidate the caching argument; remember Applets? Yeah, Java had lots of problems on the client side, but what you saw with client-side Java were the assumptions of server-side engineers (who get to control things like their deployment VM versions, their hardware, their spindle speeds, etc.) imported to distributed code environments where you weren't pulling from a fast local disk in those critical moments when you first introduced your site/app/whatever to users. Instead, you saw the piles upon piles of JARs that get created when you assume that the next byte is nearly free, that disk is only 10ms away, and that cold startups are the rare case. It worked out predictably. Java-like systems succeed on the client when their cultural assumptions align with deployment constraints -- Android and iOS are great examples. Their mediated install processes see to it.

Back out here on the woolly web, caching is something that's under user control. It must be, for privacy reasons, and we know from long experience that users clear their caches. The "cache" they can't clear is the baseline set of functionality the platform provides -- i.e., the built-in stuff on all the runtimes a developer cares about...which is what specs can affect (obliquely and with some delay).

By the time you're having a serious discussion about adding a thing to a spec among participants who aren't obvious bozos, you can bet your sweet ass that the reason it was brought up in the first place is a clear and obvious replication of effort among existing users bordering on the ubiquitous -- often in libraries they use. Saying that someone can write a library rises no higher than mere truism, and saying that a standards body shouldn't provide some part of the common features found among those existing libraries because caching will make the cost of those libraries moot is ignorance or standards-jujitsu in an attempt to get a particular proposal shelved for some other reason.

As bad as the above is, consider the (historically prevalent) use of this argument regarding "why we don't need no stinking" markup features -- just "fix" browser caching and "give me a low-level thing and I'll build [rounded corners, gradients, new layout mechanisms, CSS 3D transforms, etc.] myself". The subtle bias towards JavaScript and away from a declarative web is one of the worst, most insidious biases of the already-enfranchised upper class of JavaScript-savvy web developers, perpetuating a two-tier web in which "real engineers" can do better every year while those same expressiveness and performance gains aren't transmitted to folks without CS degrees (yes, I'm looking straight at you, WebGL). You almost want to give it to them, though, so they'll finally come to terms with how wrong they really are about whether caching can ever be "fixed".

We've spent a decade on this -- remember that we're all using CDNs, pulling our JS libraries from edge-cached servers with stable URLs for optimal LRU behavior, running minifiers, setting far-future Expires headers, etc. And that's just what webdevs are doing: meanwhile, browser vendors have been working day and night to increase cache sizes, keep caches full where possible, and implement sometimes-crazy specs that promise to help. It's not for lack of trying, but we still don't collectively know how to "fix" caching.
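
None of those habits is exotic. Here's a minimal sketch of the serving side of them, assuming a Node.js server; the "static" directory, the content-hashed file name convention, and the one-year max-age are illustrative assumptions, not anything this post prescribes:

    // A hedged sketch of "stable URLs + far-future caching", assuming
    // assets are published under content-hashed names (e.g. "app.3f2a9c.js"):
    // the URL stays stable for as long as the bytes do, and a new hash
    // busts the cache on deploy.
    var http = require('http');
    var fs = require('fs');
    var path = require('path');

    http.createServer(function (req, res) {
      var file = path.join(__dirname, 'static', path.normalize(req.url));
      fs.readFile(file, function (err, body) {
        if (err) {
          res.writeHead(404);
          return res.end('Not found');
        }
        res.writeHead(200, {
          'Content-Type': 'application/javascript',
          // Far-future caching: one year, the conventional upper bound.
          'Cache-Control': 'public, max-age=31536000'
        });
        res.end(body);
      });
    }).listen(8080);

And even with all of that in place, a first visit -- or any visit after the user clears their cache -- still pays full freight for every library byte.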

Think libraries are free? Show me how you'll make 'em free and I'll start taking you seriously. Until then, bozo bit flipped.

"It should have exactly the same API as library X"

Similar arguments include "it must be polyfillable", etc.

Why is this bullshit? Not because it represents an aversion to being caught in an implementation/deployment dead zone -- that's a serious concern. Nobody wants a better world dangling out there just beyond reach, waiting for Vendor X to pick it up and implement it or for Version Y of Old Browser to die so that 90% of your clients can access the feature. That's where libraries provide great value and will continue to, no matter what the eventual feature looks like. Remember, the dominant libraries in use by developers don't have Firefox-, Chrome-, IE-, and Opera-specific versions that you serve up based on the client. The Platonic ideal is 180 degrees in the other direction: one code base, feature detection, polyfills, progressive enhancement -- basically anything within (and often well outside of) reason to keep from serving differential content. So your library is going to have all those code paths anyway until all of your clients have the new-spec version. Optimizing based on existing API design because "now we can polyfill it without extra effort!" misunderstands the roles of both spec designers and libraries, and serves users very poorly indeed.
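
For concreteness, "one code base, feature detection, polyfills" tends to look like the following -- a hedged sketch, testing for native <input placeholder> support; the shim logic is a stand-in for whatever your library actually ships, not any particular library's API:

    // Feature detection: does this browser implement placeholder natively?
    function supportsPlaceholder() {
      return 'placeholder' in document.createElement('input');
    }

    // A stand-in shim: fake the placeholder by toggling hint text on
    // focus/blur. Real libraries do this with more care (styling, form
    // submission), but the shape is the same.
    function placeholderShim() {
      var inputs = document.getElementsByTagName('input');
      for (var i = 0; i < inputs.length; i++) {
        (function (el) {
          var hint = el.getAttribute('placeholder');
          if (!hint) return;
          if (!el.value) el.value = hint;
          el.onfocus = function () {
            if (el.value === hint) el.value = '';
          };
          el.onblur = function () {
            if (!el.value) el.value = hint;
          };
        })(inputs[i]);
      }
    }

    // Progressive enhancement: every client gets the same script; only
    // the ones missing the feature run the patch.
    if (!supportsPlaceholder()) {
      placeholderShim();
    }

Good libraries already work this way, which is exactly why pinning a spec's design to one library's surface area buys polyfill authors so little.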

Where this argument becomes truly inadmissible, though, is when it seeks to define the problem as "what our API already does". Turns out that the language you can design features with when all you have is JavaScript and HTML/CSS patterns is...well...the JS, HTML, and CSS you have today. Powerful, yes. Expressive? Hrm. If your job, however, is to evolve an integrated platform, taking on a constraint predicated on the current form of your systems is nuts. There are some hard constraints like that (backwards compatibility), but adopting some form of hack as the "blessed" way of doing something without looking around and asking "how can we do this better, given that we have a more powerful and expressive language to solve the problem with?" is nothing but lost opportunity.

Yes, a standards group might look around and go "nope, we can't do better than that polyfill/library" and adopt the solution wholesale. That's not a bad thing. What is bad, though, is advocacy about far-future solutions to current problems based solely on the idea that some library totally nailed it and we shouldn't be asking for anything more -- e.g., Microdata cribbed from RDFa and Microformats when what you really wanted was Web Components. Rote recital of existing practice is predictably weak sauce that robs the platform of its generative spark. We should all be hoping for enhancements that make the future web stronger than today's, and you only find those opportunities by taking off the blinders and giving yourself free rein to do things that libraries, polyfills, and hacks just can't.

What to do about the developers and users caught in the crossfire? Well, we can advocate for degradable, polyfill-friendly designs, but there are limits to that. The most hopeful, powerful way to make the pain disappear is to ensure that more of our users, year over year, are on browsers/platforms that auto-update and won't ever be structurally left behind again. And yes, that's something that every web developer should be working towards: prompting users to update to current-version, auto-upgrading -- "evergreen", if you will -- browsers. Remember: you get the web you ask your users for.

"Just tell us the use cases"

This is the one I'm most sick of hearing from the mouths of standards wonks and authors. What they're trying to say is some well-meaning combination of "help us understand the problem you're actually trying to solve" and "don't hand us your solution; motivate the need so we can design for the general case".

What it often winds up doing, however, is serving as a way for a standards body or author to shut up unproductive folks; folks who aren't willing to do the work of helping them come to enlightenment about the architecture of the current solution, the constraints it was built under, and the deficiencies it leaves you with. Or it can be used to avoid that process entirely -- which is where we get into bullshit territory. Taken to the extreme (and it too often is), the "use-cases, not details" approach infantilizes standards body participants, setting the bar too high for the folks who are crying out for help and setting it too low for the standards body because, inevitably, the tortured text of the "use cases" is an over-terse substitute for a process, a way of building, and an architectural approach. Use-cases (as you see them for HTML and CSS in particular) become architecture-free statements, no more informative than a line plucked at random from Henry V. Suggestive, yes. Useful? Nope. The very idea of building standards without a shared concept of architecture is bogus, and standards participants need to stop hiding behind this language. If you don't understand something, say so. If many people work hard to explain it and can't, ask for a sample application or snippet to hack on so you can do a mind-meld with their code and ask questions about it in context.

Yes, having the use-cases matters, but nobody should be allowed to pretend that they're a substitute for understanding the challenges of an architecture.


While I've only covered a few of the very worst spec-building tropes, it's by no means the end of the rhetorical shit list. Perhaps this will be a continuing series -- although I'm sure this post will offend enough folks to make me think twice about it. If we put an end to just these, though, a lot of good could be accomplished. Here's to hoping.