tl;dr version: Henri Sivonen’s arguments against vendor prefixing for CSS properties focus on harm without considering value, which in turn has led him to a nonsensical set of conclusions and recommendations. Progress is a process, and vendor prefixes have been critical in accelerating that process for CSS.
For a while now I’ve been hearing the meme resurface from CSS standards folks and a few implementers that “vendor prefixes have failed”. I’d assumed this was either a (bad) joke or that it was one of those things that web developers would scoff at loudly enough to turn the meme recessive. I was wrong.
Henri Sivonen, Mozilla hacker extraordinaire, has made the case directly and at length. Daniel Glazman, co-chair of the CSS WG, posted a point-by-point response. If you have the patience, you should read both.
Lost in the debate between “browser people” and “spec people” is the essential nature of what has happened with prefixes: they worked. From the perspective of a web developer, any first approximation of the history of vendor prefixes is pure win, even if only a fraction of the value that has been delivered behind them is attributable to prefixes un-blocking vendors from taking risks and shipping early.
Daniel’s rebuttal to Henri gets a lot of things right, but he gives in on an essential point: by agreeing with Henri that vendor prefixes are “hurting web authors”, he writes off the benefits that they’ve delivered — namely the ability of vendors to get things out to devs in a provisional way that has good fallback and future-proofing properties, and the ability for devs to build with/for the future in an opt-in, degradable way.
Rounded corners, gradients, animations, flex box, etc. are all design and experience enablers that developers have been able to take advantage of while waiting for the standards dust to settle, and thanks to W3C process, it takes a LONG time to settle. Yes, that has some costs associated with it. Henri is very worried that browsers that aren’t keeping up quickly will be “left behind” by webdevs who use only one vendor’s prefix. But surely that’s a lesser harm than not getting new features and not having the ability to iterate. And it provides incentive for trailing browsers to try to make a standard happen. What’s not to love?

More to the point, I just don’t believe that this is a serious problem in practice. What front-ender in 2011 doesn’t test on at least two browsers? Yes, yes, I’m sure such a retrograde creature exists, but they were going to be making non-portable content regardless of prefixes. Assuming you’re testing fallback at all (e.g., by testing on more than one browser), a prefix not appearing in some browser is just the fallback case. CSS FTW! Webdevs who don’t test on more than one browser…well, they’re the ones hanging the noose around the neck of their own apps. Vendor prefixes no more enable this stupidity than the existence of the User-Agent header does. Compatibility is a joint responsibility, and the best each side (browser, webdev) can hope of the other is some respect and some competence. Cherry-picking egregious examples and claiming “it’s hurting the web” seems, at a minimum, premature.
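To make the fallback point concrete, here’s a sketch of the idiom as it was commonly written circa 2011 (the `.card` class name is just for illustration). Because CSS parsers drop declarations they don’t recognize, each browser picks up only the lines it understands, and everything else degrades to square corners rather than breaking:

```css
/* Rounded corners with graceful degradation. A browser that
   recognizes none of these declarations simply renders square
   corners; nothing errors out. */
.card {
  -moz-border-radius: 8px;    /* Gecko's prefixed implementation */
  -webkit-border-radius: 8px; /* WebKit's prefixed implementation */
  border-radius: 8px;         /* the standard property, listed last
                                 so it wins once browsers support it */
}
```

That last line is the future-proofing: the day a browser ships the unprefixed property, existing content starts using it automatically.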
And how did we think we’d get a good standard, anyway? By sitting in a room in a conference center more often and thinking about it harder? Waiting on a handful of early adopters to try something out in a tech demo and never stress it in practice? That’s not a market test (see: XHTML2), it doesn’t expose developers to the opportunities and tradeoffs that come with a new feature, and it doesn’t do anything to address the inevitable need to integrate feedback at some point.
Yes, we could go with Henri’s suggestion that the first person to ship wins by default, never iterate on any designs, and avoid any/all first-mover-disadvantage situations, but who among the browser vendors is perfect? And what would the predictable consequences be? I can only assume that Henri thinks we’ll end up in a situation where vendors coordinate with the CSS WG early to add new stuff, design things more-or-less in the open, and only ship features to stable (no flag) when they’re sure of their design. That could happen at the limit, but I doubt it. Instead, the already fraught process of adding new features to the platform will be attempted by even fewer engineers. Who wants the responsibility of having to be perfect lest you screw the web over entirely? Fuck that noise, I’m gonna go work on a new database back-end or tune something to go faster.

Browsers are made by smart people who have a choice of things to work on, and any time you see a new platform feature, it probably came about as the result of an engineer taking a risk. Many times the engineers in a position to take those risks don’t have a great sense for what good, idiomatic web platform features might look like, so they’ll need to tweak and iterate based on feedback. And feedback is painfully hard to extract from webdevs unless you’ve made something available in a tangible way, such that they can use it and discover the limitations.

Shipping things only to a dev channel is perhaps a good idea for other aspects of the platform where we can’t count on CSS’s forgiving parsing behavior (the basis for prefixes). Syntax changes for JS and CSS seem like good examples. But for features that are primarily new CSS properties? Oy. Making the stakes even higher by reducing the ability to get feedback and iterate isn’t going to lead to a harmonious world of good, fast standards creation. It’s going to predictably reduce the amount of new awesome that shows up in the platform.
Prefixes are an enabler in allowing the necessary process of use, iteration, and consensus building to take place. Want fewer messes? There’s an easy way to achieve that: try less stuff, add fewer features, and make each one riskier to add. That’s Henri’s prescription, whether or not he knows it, and the predictable result is a lower rate of progress — advocating this sort of thing is much worse for the web and for developers than any of the harm that either Henri or Daniel perceive.
Which brings me to Henri’s undifferentiated view of harm. His post doesn’t acknowledge the good being done by prefixed implementations — I get the sense he doesn’t build apps with this stuff, or it’d be obvious how valuable prefixed implementations are for workaday web app building — instead focusing on how various aspects of the process of prefixed competition can be negative. So what? Everything worth having costs something. Saying that things “hurt the web” or “hurt web developers” without talking in terms of relative harm is just throwing up a rhetorical smoke screen to hide behind. If you focus only on the costs but write the benefits out of the story, of course the conclusion will be negative. In many cases, the costs that Henri points out are correctly aligned with getting to a better world: having to type things out many times sucks, creating demand among webdevs for there to be a single, standardized winner. Having multiple implementations in your engine sucks, creating demand from vendors to settle the question and get the standards-based solution out to users quickly. Those are good incentives, driven by prices that are low but add up over time in ways that encourage a good outcome: a single standard implemented widely.
And as Sylvain Galineau pointed out, what looks like pure cost to one party might be huge value to another. I think there’s a lot of that going on here, and we shouldn’t let it go un-contextualized. The things that Henri sees as downsides are the predictable, relatively minor costs inherent in a process that allows us to make progress faster and distribute the benefits quickly, all while minimizing the harm. That he’s not paying the price for not having features available to build with doesn’t mean those opportunity costs aren’t real and aren’t being borne by webdevs every day. Being able to kill table and image based hacks for rounded corners is providing HUGE value, well ahead of the spec. Same for gradients, transitions, and all the rest. Anyone calling prefixed implementations in the wild a bad thing needs to argue that the harm is greater than all of that value. I don’t think Henri could make that case, nor has he tried.
I think the thing that most shocks me about Henri’s point of view is that he’s arguing against a process when in fact the motivating examples (transforms, gradients) have been sub-optimal in exactly the better-than-before ways we might have hoped for! Gradients, for example, saw a lot of changes, and browsers had different ideas about what the syntax should be. Yes, it’s harder to get a consistent result when you’re trying to triangulate multiple competing syntaxes, but we got to use this stuff, get our hands dirty, and get most of the benefits of the feature while the dust settled. Huzzah! This is exactly the way a functioning market figures out what’s good! Prefixes help developers understand that stuff can and will change, and they clear the way for competition of ideas without burdening the eventual feature’s users with legacy baggage tied to a single identifier.
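For readers who didn’t live through it, here’s roughly what that dust-settling period looked like in a stylesheet (colors and the `.header` class are illustrative). Each vendor experimented with its own syntax behind its own prefix, and an invalid declaration knocks out only itself, so the solid-color fallback always survives:

```css
/* The competing gradient syntaxes, roughly as they shipped.
   Each browser applies the last declaration it can parse;
   everything it can't parse is silently dropped. */
.header {
  background: #336699;                                     /* solid fallback */
  background: -webkit-gradient(linear, left top, left bottom,
                               from(#336699), to(#112244)); /* old WebKit */
  background: -webkit-linear-gradient(top, #336699, #112244); /* newer WebKit */
  background: -moz-linear-gradient(top, #336699, #112244);    /* Gecko */
  background: -o-linear-gradient(top, #336699, #112244);      /* Opera */
  background: linear-gradient(to bottom, #336699, #112244);   /* the standard */
}
```

Verbose? Sure. But every line of that verbosity is a vendor testing a design in the market, and the repetition itself is the pressure that pushes everyone toward the single unprefixed form at the bottom.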
So what about the argument that there might be content that doesn’t (quickly?) adopt the non-prefixed version, or that vendors can’t remove their prefixed implementations because content depends on it?
To the first, I say: show me a world where 90+% of users have browsers that support the standard feature, and I’ll show you a world in which nobody (functionally) continues to include prefixes. That process is gated in part by the WG’s ability to agree to a spec, and here I think there’s real opportunity for the CSS WG to go faster. The glacial pace of the CSS WG in getting things to a final, ratified spec is in part due to amazingly drawn-out W3C process, and in part a cultural decision on the part of the WG members to go slow. My view is that they should be questioning both of these and working to change them, not blaming prefixes for whatever messes are created in the interim.
As for removing prefixes, this is about vendors just doing it, and quickly. But the definition of “quickly” matters here. My view is that vendors should be given at least as long as it took to get the standard finalized, measured from the introduction of their prefixed version, for the removal process to complete. So if Opera adds an amazing feature behind an -o- prefix in early 2012 and the standard is finalized in 2014, the deprecation and eventual removal should be expected to take another two years (through 2016). This has a nice symmetry of incentives that punishes the WG for going slow (want to kill prefixed impls? get the standard done) while allowing the vendors who took the biggest risks to provide the softest landings for their users. And it doesn’t require that we simply go all-in on the first design to ship. Yes, there will be mounting pressure to get something done, but that’s good too!
The standards process needs to lag implementations, which means that we need spaces for implementations to lead in. CSS vendor prefixes are one of the few shining examples of this working in practice. It’s short-term thinking in the extreme to flag the costs associated with them as justifying their removal, or even to suggest that the costs are too high.
And webdevs, always be skeptical when someone working on an implementation or a spec tells you that something is “hurting the web” when your experience tells you otherwise. The process of progress needs more ways to effectively gauge webdev interest, collect feedback, and test ideas. Not fewer or narrower channels.