Inadmissible Arguments

I spend a lot of time working in, on, and around web standards. As a part of this work, several bogus perspectives are continuously deployed to defend preferred solutions. I hereby call bullshit on the following memetic constructs:

“That’s just a browser caching problem.”

Look for this to be deployed along standards-wonk classics such as “our job should just be to provide low-level primitives”, “we don’t know enough to solve this”, and “people will just write tiny libraries to provide that”.

“It’s a caching problem” is a common refrain of those who don’t build apps and aren’t judged on their latency. And it’s transparently bullshit in the web context. It relies on you forgetting — for just that split second while it sounds possible to duck the work at hand — that we’re talking about a platform whose primary use-case is sending content across a narrow, high-latency pipe. If you work in web standards and don’t acknowledge this constraint, you’re a menace to others.

Recent history alone should be enough to invalidate the caching argument; remember Applets? Yes, Java had lots of problems on the client side, but what you saw with client-side Java were the assumptions of server-side engineers (who get to control things like their deployment VM versions, their hardware, their spindle speeds, etc.) imported into distributed-code environments where you weren't pulling from a fast local disk in those critical moments when you first introduced your site/app/whatever to users. Instead, you saw the piles upon piles of JARs that get created when you assume that the next byte is nearly free, that disk is only 10ms away, and that cold startups are the rare case. It worked out predictably, and Java-like systems succeed on the client only when their cultural assumptions do align with deployment constraints — Android and iOS are great examples. Their mediated install processes see to it.

Back out here on the woolly web, caching is something that's under user control. It must be for privacy reasons, and we know from long experience that users clear their caches. The "cache" they can't clear is the baseline set of functionality the platform provides — i.e., the built-in stuff on all the runtimes developers care about…which is what specs can effect (obliquely and with some delay).

By the time you're having a serious discussion about adding a thing to a spec among participants who aren't obvious bozos, you can bet your sweet ass that the reason it was brought up in the first place is a clear, near-ubiquitous replication of effort among existing users — often in the libraries they use. Saying that someone can write a library rises no higher than mere truism, and saying that a standards body shouldn't provide some part of the common features found among those existing libraries because caching will make the cost of those libraries moot is either ignorance or standards-jujitsu: an attempt to get a particular proposal shelved for some other reason.

As bad as the above is, consider the (historically prevalent) use of this argument regarding “why we don’t need no stinking” markup features — just “fix” browser caching and “give me a low-level thing and I’ll build [rounded-corners, gradients, new layout mechanisms, CSS3d xforms, etc.] myself”. The subtle bias towards JavaScript and away from a declarative web is one of the worst, most insidious biases of the already enfranchised upper-class of JavaScript-savvy web developers, perpetuating a two-tier web in which “real engineers” can do better every year but wherein those same expressiveness and performance gains aren’t transmitted to folks without CS degrees (yes, I’m looking straight at you, WebGL). You almost want to give it to them, though, so they’ll finally come to terms with how wrong they really are about the ability for caching to be “fixed”.

We’ve spent a decade on this — remember that we’re all using CDNs, pulling our JS libraries from edge-cached servers with stable URLs for optimal LRU behavior, running minifiers, setting far-forward expires, etc. And that’s just what webdevs are doing: meanwhile browser vendors have been working day and night to increase the sizes of caches, ensure full caches where possible, and implement sometimes crazy specs that promise to help. It’s not for lack of trying, but we still don’t collectively know how to “fix” caching.

Think libraries are free? Show me how you’ll make ‘em free and I’ll start taking you seriously. Until then, bozo bit flipped.

“It should have exactly the same API as library X”

Similar arguments include "it must be polyfillable", etc.

Why is this bullshit? Not because it represents an aversion to being caught in an implementation/deployment dead zone — that's a serious concern. Nobody wants a better world dangling out there just beyond reach, waiting for Vendor X to pick it up and implement or for Version Y of Old Browser to die so that 90% of your clients can access the feature. That's where libraries provide great value and will continue to, no matter what the eventual feature looks like. Remember, the dominant libraries in use by developers don't have Firefox, Chrome, IE, and Opera-specific versions that you serve up based on client. The platonic ideal is 180 degrees in the other direction: one code base, feature detection, polyfills, progressive enhancement — basically anything within (and often well outside of) reason to keep from serving differential content. So your library is going to have all those versions anyway until all of your clients have the new-spec version. Optimizing based on existing API design because "now we can polyfill it without extra effort!" misunderstands both the role of spec designers and of libraries and serves users very poorly indeed.
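The "one code base, feature detection, polyfills" ideal described above usually reduces to a pattern like the following sketch. The detect-then-shim structure is the technique; `Array.prototype.includes` is just a convenient example of a once-missing platform method:

```javascript
// The shim itself, written as a standalone function so it can be
// installed conditionally (or tested directly).
function includesPolyfill(searchElement, fromIndex) {
  const len = this.length >>> 0;
  let i = Math.max(fromIndex | 0, 0);
  for (; i < len; i++) {
    const el = this[i];
    // SameValueZero comparison: unlike indexOf, NaN matches NaN.
    if (el === searchElement || (el !== el && searchElement !== searchElement)) {
      return true;
    }
  }
  return false;
}

// Feature detection: install the shim only where the native method is
// missing. Every client gets the same code; modern engines take the
// fast native path, older ones get the fallback.
if (!Array.prototype.includes) {
  Object.defineProperty(Array.prototype, 'includes', {
    configurable: true,
    writable: true,
    value: includesPolyfill,
  });
}
```

The point of the pattern is that no browser-specific bundle ever ships — which is also why "the spec must mirror library X's API" doesn't follow from it.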

Where this argument becomes truly inadmissible, though, is when it seeks to define the problem to be “what our API already does”. Turns out that the language you can design features with when all you have is JavaScript and HTML/CSS patterns is…well…the JS, HTML, and CSS you have today. Powerful yes. Expressive? Hrm. If your job, however, is to evolve an integrated platform, taking on a constraint predicated on the current form of your systems is nuts. There are some hard constraints like that (backwards compatibility), but adopting some form of hack as the “blessed” way of doing something without looking around and going “how can we do this better given that we have more powerful and expressive language to solve the problem with?” is nothing but lost opportunity.

Yes, a standards group might look around and go "nope, we can't do better than that polyfill/library" and adopt the solution wholesale. That's not a bad thing. What is bad, though, is advocacy about far-future solutions to current problems based solely on the idea that some library totally nailed it and we shouldn't be asking for anything more — e.g.: Microdata cribbed from RDFa and Microformats when what you really wanted was Web Components. Rote recital of existing practice is predictably weak sauce that robs the platform of its generative spark. We should all be hoping for enhancements that make the future web stronger than today's, and you only find those opportunities by taking off the blinders and giving yourself free rein to do things that libraries, polyfills, and hacks just can't.

What to do about the developers and users caught in the crossfire? Well, we can advocate for degradable, polyfill-friendly designs, but there are limits to that. The most hopeful, powerful way to make the pain disappear is to ensure that more of our users year-over-year are on browsers/platforms that auto-update and won’t ever be structurally left-behind again. And yes, that’s something that every web developer should be working towards; prompting users to update to current-version auto-upgrading — “evergreen”, if you will — browsers. Remember: you get the web you ask your users for.

“Just tell us the use cases”

This is the one I’m most sick of hearing from the mouths of standards wonks and authors. What they’re trying to say is some well-meaning combination of:

  • If you tell us exactly how you’re doing whatever it is you’re accomplishing today, we’ll run the risk of just adopting some API that was designed with some bogus constraints (see above).
  • We need to communicate to a lot of people involved in this effort why they should care, ’cause putting stuff into browsers is a shitload of work.

What it often winds up doing, however, is serving as a way for a standards body or author to shut up unproductive folks; folks who aren't willing to do the work of helping them come to enlightenment about the architecture of the current solution, the constraints it was built under, and the deficiencies it leaves you with. Or it can be used to avoid that process entirely — which is where we're getting into bullshit territory. Taken to the extreme (and it too often is), the "use-cases, not details" approach infantilizes standards body participants, setting the bar too high for the folks who are crying out for help and too low for the standards body because, inevitably, the tortured text of the "use cases" is an over-terse substitute for a process, a way of building, and an architectural approach. Use-cases (as you see them for HTML and CSS in particular) become architecture-free statements, no more informative than a line plucked at random from Henry V. Suggestive, yes. Useful? Nope. The very idea of building standards without a shared concept of architecture is bogus, and standards participants need to stop hiding behind this language. If you don't understand something, say so. If many people work hard to explain it and can't, ask for a sample application or snippet to hack on so you can do a mind-meld with their code and can ask questions about it in context.

Yes, having the use-cases matters, but nobody should be allowed to pretend that they're a substitute for understanding the challenges of an architecture.

While I’ve only covered a few of the very worst spec-building tropes, it’s by no means the end of the rhetorical shit list. Perhaps this will be a continuing series — although I’m sure this post will offend enough folks to make me think twice about it. If we put an end to just these, though, a lot of good could be accomplished. Here’s to hoping.

Comments


  1. Posted August 15, 2012 at 10:55 am

    “Just tell us the use cases” is a bit of a strawman, though I see where it comes from.
    "Document your use cases" is a different statement. I'm glad that this is now common enough for you to call it out as a failing; to me that represents progress over the "Here's my theoretical feature that I am going to make this standards body legislate into existence" approach.
    Is part of the problem that extending browsers via native code plugins is now so far out of fashion that we don’t have models for behaviour we want them to adopt? Without QuickTime and WindowsMedia to look at for video or Flash to look at for canvas, would they have happened?

  2. Posted August 15, 2012 at 1:23 pm

    I feel out of the loop here. What does the term "polyfill" mean? Some sort of backward compatibility with a previous API?

  3. Posted August 15, 2012 at 2:35 pm

    Well said, sir! An excellent article, and many excellent points. It’s good to hear a voice on the outside echoing what the voice in my head has been screaming for years.

    @Aliaksandr: It's somewhat the opposite, actually — it's like forward compatibility for old browsers. If you're using IE6, the only way to get nifty new APIs and HTML5/CSS3 features is to "fill" in the missing pieces with JavaScript that serves the same purpose. Remy Sharp put it well:

  4. Patrick H. Lauke
    Posted August 16, 2012 at 11:11 am

    …now trying to remember what I said at our meeting on appCache the other day, and see if I offended or enlightened…

  5. Posted August 17, 2012 at 4:03 am

    Kevin: you end with a fascinating question. My guess is that “native” platforms are the new “plugins” in this respect. They’re what’ll keep those of us who care about the web up at night, worried that we’ll be eclipsed and/or doomed to some platform backwater. Goodness knows that’s the explicit strategy of every native platform — it has to be — and why I’m grateful that ChromeOS and B2G are out there still working towards the “the web is the inevitable model” end-game.

    But the fact that native platforms aren’t as all-up-in-your-business as plugins is an interesting thought. It might help create a blind spot in the competitive assessments of those of us who work on the web platform, encouraging us to take the “fallback system” view of the web and accept mediocrity too willingly.

  6. Posted August 23, 2012 at 12:19 am

    With so much (healthy) competition, it's important for startups to be innovating at the sharp end. We've got to focus on the tiny bit of technology that's unique to our product and innovate around that. With such a small team (there's 2 of us), any time we spend doing stuff that's been done before effectively lessens our chances of differentiating our idea from the thousands of others being developed at the same time. I agree with you Alex on the two-tier system. Even those who have CS degrees need the basic stuff to get easier so we can concentrate on the hard bit of our app. We all waste time on the polyfills in the same way that 5 years ago, we used to waste time on hacks for non-compliant browsers.

  7. Posted September 8, 2012 at 10:23 pm

    I try to avoid polyfills. They tend to hit the CPU of PCs running older browsers to the point of creating a poorer experience than simply leaving out a feature.

    Regarding serving conditionally optimized JS or CSS, there is a business being built around this by companies like Strangeloop, Akamai and many others. Google site performance and most companies offering measurement tools also offer optimization.

    I am all for including common functionality into the browser platform, native rather than plugins being optimal. Asking users to download a plugin? May as well go back to Flash then. Chrome Frame is an exception which seems to be the best solution for polyfill-type forward compatibility. If MS would offer their own version of this with integration into security settings, we could get corporate America into the modern age and allow for an evergreen experience there as well.

    Thanks for your efforts. Keep fighting.