The W3C Cannot Save Us

Things are finally moving over in CSS-land. On the positive side, CSS column layouts are looking pretty nice, having dropped their dependency on the janktastic “advanced” layout module, and there’s some initial movement on improving the CSS-OM.

But all is not well, nor has it been for a long, long time. No work on hbox or vbox, or on mixins/inheritance. There’s also no indication that the WG has taken a stab at the kind of expressive power that Microsoft exposed with CSS expressions.
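
For anyone who never ran into them, CSS expressions let an author bind an arbitrary script expression to a property value and have IE re-evaluate it as the document changes. Here’s a from-memory sketch of the IE-only feature, offered as an illustration of the capability rather than a recommendation (the “sidebar” element id is invented):

```javascript
// IE-only: bind a script expression to a CSS property so that it re-evaluates
// as the document reflows. The rough stylesheet equivalent would be:
//   #sidebar { width: expression(document.body.clientWidth > 800 ? "760px" : "auto"); }
// "sidebar" is a hypothetical element id used only for this sketch.
var sidebar = document.getElementById("sidebar");
sidebar.style.setExpression(
  "width",
  "document.body.clientWidth > 800 ? '760px' : 'auto'"
);
```

Abused and slow as they often were, that kind of computed, data-driven styling is exactly the expressive power the current drafts don’t touch.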

More importantly, Andy Clarke is pretty disgusted by what he sees of the process and participants and so, apparently, is David Baron. Anyone who has met David in person will probably understand how much of a big deal this really is. It signals the effective end of the CSS WG as we (don’t) know it. Rebuilding credibility in the WG is going to be much, much harder now that Mozilla’s representative has effectively given up on the closed-door process. The working group’s secret cabal style of operation is imploding. It was inevitable, but the timing is still a surprise.

But why was it inevitable? And should we take Andy’s suggestion seriously and expect a re-chartered WG to do better? After thinking about it for a while, I think the answer is that we can’t expect any standards body to do what is being asked of the CSS WG; namely, to invent the future by committee. It’s particularly unreasonable to expect progress when deployed browsers provide so much momentum that the safest thing to do is to solve the little problems and ignore the large ones. This is also a well-worn stalling tactic used by vendors playing for time, either to marginalize the standards process or to solidify their lead until it becomes a de facto standard. This is a good reason to reject closed working groups, but the web standards to which advocates now cling were developed in those same closed systems. Closed clearly can’t be all bad, nor can it be a primary cause for the lack of progress. We seem to be hugely confused and conflicted about the relationship between standards, vendors, and how we get to a better future.

Before I lay out all the forces involved, let me first say what everyone knows but few truly accept:

In order for the future to be better by a large amount, it must be different by a large amount.

I think that statement alone is enough to indict Opera’s anti-trust actions as stupid and ill-considered. But we should also recognize that it forms the basis of Opera’s grievances. We should all be pissed off that the discussion today hinges on how we will get MSIE to improve by slight degrees (or should we expect more?). Opera could have done better, though, by shipping Gears and working with Google to make it a host for pluggable renderers…like Opera. Too bad Opera has prioritized proving a point over actually improving the situation. But I digress.

So what was so different about the late ’90s that it allowed a closed process to make huge gains in short order, while we can’t even get basic architectural issues addressed in a timely fashion today?

The first major reason is that web developers in the ’90s were looking forward, not backward. I remember being excited about getting the chance to use new features and not caring who gave them to us. As a community, web developers hadn’t collectively “picked sides”. I think the market as a whole still has that essential optimism built in; it’s just that the more self-identified web developers focus on how standards-compliant things are (or aren’t), the more we lose the sense that it can get better. So platforms not tied to standards race ahead. Why is anyone surprised that Adobe has essentially kicked HTML’s ass with Flex, or that Microsoft feels it can do the same with Silverlight in a couple of revs? If you buy either Adobe or MS’s lines about how these platforms aren’t there to replace HTML, I’d like to sell you some prime property in the Pacific.

But, I hear you yelling, it sucked to build things back then! It was hard! We didn’t know what bits of the technology we could use portably, and the W3C saved us from that mess!

Oh really?

Try teaching a good programmer without a web background to build anything reasonably sophisticated with web technologies today. Doing so will teach you, painfully and embarrassingly, that there are huge tracts of the HTML, CSS, and DOM specs that you simply can’t use. IE’s bugs combined with its market share conspire to ensure that’s true, and we wouldn’t get off the hook should IE 8 magically transform into a perfect reference implementation. Mozilla, Opera, and Safari all have their own warts as we get to the edges of what’s theoretically possible with current specs. And that’s not even taking into account how utterly wrong, broken, and silent the specs are in several key areas. Even if the specs were great, we’d still be gated by the adoption of new renderers.

It’s clear then that vendors in the market are the ones who deploy new technologies which improve the situation. The W3C has the authority to standardize things, but vendors have all the power when it comes to actually making those things available. Ten years ago, we counted on vendors introducing new and awesome things into the wild and then we yelled at them to go standardize. It mostly worked. Today, we yell at the standards bodies to introduce new and awesome things and yell at the browser vendors to implement them, regardless of how unrealistic they may be. It doesn’t work. See the problem?

Andy is probably wrong to suggest that a new working group (no matter the structure) can succeed in fixing the impasses of the existing CSS WG, if only because no working group at the W3C has the power to effectively and definitively introduce new things into the market. Only browser vendors can do that, and no amount of buy-in at the W3C can force vendors to implement. The last decade is proof enough of that. We need to wise up to this key point: standards bodies sit downstream of implementations, and that is the only position from which they work well.

Until we get some great new (non-standard) CSS features out of Mozilla, Opera, and IE, nothing will get better to the extent that we will again be optimistic about the future (Safari earns a pass). The size of the improvements they deliver in the future is directly tied to our expectations of how different the future will be. Only when there are large and divergent ideas of how to proceed, expressed through competing, successful implementations, will standardization really work to whatever extent that it can reasonably be expected to.

Let that sink in a bit. To get a better future, not only do we need a return to “the browser wars”, we need to applaud and use the hell out of “non-standard” features until such time as there’s a standard to cover equivalent functionality. Non-standard features are the future, and suggesting that they are somehow “bad” is to work against your own self-interest.

Web developers everywhere need to start burning their standards advocacy literature and start telling their browser vendors to give them the new shiny. Do we want things to work the same everywhere? Of course, but we’ve got plenty of proof to suggest that only healthy browser competition is going to get us there. Restructuring the CSS WG or expecting IE8 to be “fully standards compliant” is a fool’s game.

Put simply, Zeldman is hurting you and only you can make it stop. Neither the CSS WG nor the HTML 5 WG nor, indeed, any W3C working group can define the future. They can only round off the sharp edges once the future becomes the past and that’s all we should ever expect of them. As much as they tell us (and themselves) that they can, and as much as they really would like to, the W3C cannot save us.

Update: the implosion continues apace. Hixie outs Microsoft’s rep as stalling on embeddable fonts and runs up the flag on why he thinks it’s just not working in general. Interestingly, one of the MS reps has seemingly seconded dbaron’s “let’s do this in public” sentiment as Hixie smacks down Bert Bos’s leadership of the CSS WG. It’s easy to get caught up in the horse race on this, but I want to again suggest that none of this will get better until the browsers start taking risks. Hopefully openness and transparency will allow the world to judge the actions of the WG until then, without removing the ability for the WG to function when the cavalry does arrive.

29 Comments

  1. Robb Greathouse
    Posted December 16, 2007 at 9:50 pm | Permalink

    Could this be done by going completely open source?

    Produce plugins for Mozilla, and wait for Microsoft and Opera to follow.

    Boinc now supports human-based projects as well as grid computing. A steering committee would be needed to provide some coordination, but that could be done through Boinc.

  2. snack
    Posted December 16, 2007 at 11:27 pm | Permalink

    “In order for the future to be better by a large amount, it must be different by a large amount.

    I think that statement alone is enough to indict Opera’s anti-trust actions as stupid and ill-considered.”

    Seems that you are contradicting yourself.

    If Opera’s antitrust complaint is successful, the future will indeed be different by a large amount.

  3. Posted December 16, 2007 at 11:42 pm | Permalink

    A snippet from my blog post commenting on things written on this one:

    …there are huge tracts of the HTML, CSS, and DOM spec’s that you simply can’t use. IE’s bugs combined with its market share conspire to ensure that’s true…Mozilla, Opera, and Safari all have their own warts as we get to the edges of what’s even theoretically possible w/ current specs. And that’s not even taking into account how utterly wrong, broken, and silent the specs are in several key areas. Even if the specs were great, we’d still be gated by the adoption of new renderers.

    So, it is still like the 90’s but to a lesser degree, right? When it comes to building something, anything for the web, designers and developers have to take the different browsers and how they render things into consideration.

    The difference between the 90’s and now is that the industry began to care about making websites that didn’t have badges that stated “Best viewed in [insert browser name.version].” Web designers and developers didn’t want to write markup and/or code specifically aimed at particular browsers or, God forbid, multiple versions of the same site/application to target all of them.

    Why would anyone want to return to that?

  4. Posted December 17, 2007 at 1:54 am | Permalink

    “If Opera’s antitrust complaint is successful, the future will indeed be different by a large amount.”

    Indeed. If you think the web will be a better place by sprinkling more proprietary technology onto it, then that’s what you have in Flex and Silverlight. That’s the bright “future” you’re describing, where vendors dictate where to go and how to get there. How exactly can W3C standardize proprietary technology like that?

    Even if Flex and Silverlight are implemented in all the major browsers, how can backwards, forwards, and cross-platform compatibility be ensured when only one vendor has complete control over the technology? Sure, the W3C can reverse-engineer the whole platform and publish specifications based on the work, just like they are doing now with XMLHttpRequest, but do you really think that’s the best we can do with web standards and development in 2007? Is that as good as it gets?

    No, the world of web development is so many orders of magnitude better today than it was 10 years ago that even suggesting that going back to the old ways is better is just complete and utter madness.

  5. Hemebond
    Posted December 17, 2007 at 3:50 am | Permalink

    Meh. Disagree.

  6. Posted December 17, 2007 at 4:42 am | Permalink

    So what was so different about the late ’90s that it allowed a closed process to make huge gains in short order, while we can’t even get basic architectural issues addressed in a timely fashion today?

    Could it be that the technology is getting mature? And mature technologies do not evolve as fast as emerging ones.

    I remember being excited about getting the chance to use new features and not caring who gave them to us.

    Did you care who could use the sites?

    Web developers everywhere need to start burning their standards advocacy literature and start telling their browser vendors to give them the new shiny

    The web is not about “shiny”. The web is about sharing information – and sharing it in such a manner that anyone can access it regardless of their choice of browser.

  7. Posted December 17, 2007 at 5:21 am | Permalink

    snack, Asbjørn:

    You’re entirely right that the Opera action, if successful, will yield a largely different future, but one that I’m afraid would suck. I’m all for things that improve competition, but I don’t think this will really do it. If you look at Opera’s statements about the suit, they look to obligate Microsoft to follow the letter of web standards, not to make significant progress of the type that will be needed to stave off the likes of Flex and Silverlight. If I were a product manager for Flex, I’d be jumping for joy right about now. The strategic implications of the lawsuit are clear, and they’re not pretty.

    Lawsuits, like militaries, are blunt instruments, and Opera’s complaint may do some good, but it may also provide Microsoft with a convenient reason not to discuss standards conformance openly with the community, since what they say may eventually be used as evidence in ongoing litigation. And what if Opera succeeds? I’m all for the un-bundling of IE from Windows. Microsoft’s strong-arm tactics with computer manufacturers with regard to default software are painful and well-documented. Changes to that would be great. But what would a remedy that compels “standards compliance” really look like? I’m guessing it would look a lot like a delaying tactic which keeps our eyes off the larger prize: keeping the open web competitive for building ever-more sophisticated applications and visualizations.

    Opera should have asked instead for detailed, public plans from Microsoft regarding upcoming browser versions and a court order compelling them to meet those plans. That at least would be a remedy that spoke to the future and not the past.

    Regards

  8. Marty McKeever
    Posted December 17, 2007 at 7:18 am | Permalink

    Preach it brother!

    Standards cannot drive innovation; they can only help clean up the mess innovation leaves in her wake.

  9. Posted December 17, 2007 at 7:46 am | Permalink

    To get a better future, not only do we need a return to “the browser wars”, we need to applaud and use the hell out of “non-standard” features until such time as there’s a standard to cover equivalent functionality.

    Funny, I said almost exactly the same thing this summer.

  10. Les
    Posted December 17, 2007 at 9:56 am | Permalink

    There will be no browser wars if MS remains so dominant. Sure, it’s nice to think that developers will implement the “new shiny” to a large extent, but in the real world clients pay us to make things that work for large numbers of visitors, and that means IE.

  11. Dave
    Posted December 17, 2007 at 11:13 am | Permalink

    A dose of pragmatism is always welcome in a sea of often unrealistic idealism. Thanks.

    You don’t say it, but I presume you expect Dojo and toolkits like it to play a key role in making it possible to release products across uneven, shifting feature sets?

  12. digginestdogg
    Posted December 17, 2007 at 1:13 pm | Permalink

    Alex raises a few good points here and there about the problems with the W3C, but his faith that the right path is letting vendors lead in standards (innovation yes, but standards no) is not only unwarranted but disproved by history. If we do it his way, Microsoft IE will become the standard. Period. And, the way MS plays it, there will be no room for any others. Don’t take my word for it. Take a look at the ECMA OOXML ISO standard bid shenanigans, with all its obfuscation, secrecy, and committee packing. You think the W3C is bad? Look at that one.

    Let MS lead and open international standards will go the way of the Dodo bird. We might as well all just run Windows and bow before our new overlords in Redmond.

  13. Posted December 17, 2007 at 1:26 pm | Permalink

    Proprietary web development with the purpose of standardization is much better than proprietary, period. The latter is inevitable, so what do we do about it?

    This is pretty much the point here.

  14. snack
    Posted December 17, 2007 at 1:27 pm | Permalink

    alex: “If you look at Opera’s statements about the suit, they look to obligate Microsoft to follow the letter of web standards, not to make significant progress of the type that will be needed to stave off the likes of Flex and Silverlight.”

    I disagree completely. Everyone is free to innovate and do good stuff. Microsoft just has to follow through on its promise to implement standards correctly and be a good netizen, that’s all.

    “The strategic implications of the lawsuit are clear, and they’re not pretty.”

    On the contrary, they are extremely promising. No more having to code for each browser separately. No more other browser vendors having to support standards AND stay compatible with IE at the same time. Everyone saves bucketloads of money, and users will have an actual choice.

    “But what would a remedy that compels “standards compliance” really look like?”

    It is not all that complicated. There is a set of standards that are more or less agreed upon. Heck, if Microsoft could just implement the standards it claims to support correctly, we would have come a long way.

    Marty McKeever: “Standards cannot drive innovation”

    On the contrary. Standards are the basis of innovation. All industries are based on standards. You build your innovation on top of those, not by violating those standards. Why should the browser industry be any different from the food industry, the oil industry, etc.?

  15. Posted December 17, 2007 at 1:34 pm | Permalink

    Dave: I don’t expect Dojo (or other toolkits) to play a big role, actually. We just don’t have enough leverage, although we can improve the situation and paper over the holes to some small extent. I’ve heard that some browser vendors expect toolkits to be “where it’s at” and I think they’re fooling themselves to a large degree or at least trying to let themselves off the hook. If browser vendors really want to enable toolkits to make a difference in this fight, they need to provide a way to keep them from being perpetually re-requested off the network and deliver serious performance improvements and hooks into the parts of the page lifecycle that they expect us to manage.

  16. Walter
    Posted December 17, 2007 at 8:56 pm | Permalink

    How about providing an estimate of how much money MSIE’s deviations are costing companies and using that to build support for an outright boycott? Get a lot of competitors to agree to serve up a page that says “This page is not viewable in Internet Explorer, for the following reasons: … Please try one of the following browsers instead.”

    Make it a movement, like that Facebook anti-Beacon petition that worked so well.

  17. Posted December 19, 2007 at 12:59 am | Permalink

    In the end, we, the web working class (aka web developers, HTML/CSS grunts) will be the ones explaining to the clients/designers/company owners that the desired design will be doable in one browser and not in another.

    In the end, all we will be able to do is use the lowest common subset of our tools to build web pages.

    As you all know, they (clients/designers/company owners) all want their designs to look exactly the same on all platform/browser combinations.

    Maybe it’s time to start learning Silverlight/Flash?

  18. theTree
    Posted December 19, 2007 at 3:36 am | Permalink

    Driving standards and innovation forward through browser competition does sound healthy and Darwinian, but as Alex rightly notes himself, cool new ideas are not implementable unless they are widely supported, or degrade gracefully.

    Widely supported means MSIE, in terms of user support. And this really is decidedly un-Darwinian: natural selection of new technologies cannot take place with the supremacy Microsoft inflicts upon everyone – new ideas will not be adopted if MS chooses not to implement. This is the reason why great swathes of cool features that are adopted by everyone else aren’t in use today – it would be an incredible transformation of the web if TODAY’S standards were fully supported!

    The only body with the unified power to implement this is the W3C, so whilst I agree they should be solidifying the present set of standards, I worry that leaving innovation and new feature sets to the (flawed) competition of the browser vendors would lead to no innovation at all.

  19. Posted December 19, 2007 at 6:02 am | Permalink

    Kasimir said:

    The web is not about “shiny”. The web is about sharing information – and sharing it in such a manner that anyone can access it regardless of their choice of browser.

    The web may not have been intended to be shiny, but people try to make it shiny nonetheless. If an open standard like CSS doesn’t provide them with the tools to make it so, many will flock to Flex and Silverlight.

    What would we rather have? It seems to me that a mostly standardized web with all the content in an open, human-readable format with a relatively thin jungle of (human-readable) proprietary code on top wouldn’t be as bad as the other option, a completely standardized web with all the content in a closed, binary format on top of a thin layer of open, human-readable code (think of all the Flash sites with just a single <embed> element in the body).

  20. Wade Harrell
    Posted December 19, 2007 at 11:56 am | Permalink

    and what have we learned from history?

    dateline: July 26, 2000
    http://www.xml.com/pub/a/2000/07/26/deviant.html

    spoiler: 7+ years later and SVG is all but dead….

  21. Sebastian
    Posted December 19, 2007 at 12:18 pm | Permalink

    The solution is not in the alternative browsers adding shiny new features, because as long as IE is the standard and it doesn’t even cover the basics, you can’t use them.
    I have had this conversation many times with my boss, when some bit of CSS or other would make for a better solution but couldn’t be used because IE didn’t support it.
    For example, I had to custom-build HTML/JS bar charts when I could have used SVG, because the client didn’t plan on using Opera or Firefox.

  22. kibbles
    Posted December 19, 2007 at 3:32 pm | Permalink

    it sounds to me the frustrations are w/ the standards bodies — their slow speed and lack of transparency. that’s fine. but that’s independent of their actual work — spec’ing how features should work. certainly, there are new things that the W3C hasn’t explored as quickly as I’d like (e.g., CSS expressions).

    if I could use them all, I’d be happy w/ the palette CSS2 & 3 provide. but even today, we still can’t assume the average browser is going to support them. that’s sad. and that’s the browser vendors’ fault. period.

    side note: if one wants a vector-based, animation-based format, start using it. but HTML isn’t Flash, and it isn’t supposed to be. Flash is supposed to be Flash.

    yeah, it would be nice if, after “Browser X” comes up w/ a Super Sweet New Feature ™, it could get on the W3C standards spec faster. but that’s an issue w/ their process… and it doesn’t mean we’re stagnant as a field, nor that we should wave the “fuck standards!” flag (which I am seeing repeated). it just means we’re maturing a bit, which means going a little slower. like other industries… ISO-9000 committee work is probably pretty boring, too.

    “I remember being excited about getting the chance to use new features and not caring who gave them to us.” — yes, and this prompted “Best Viewed In XXX” disclaimers. lame.

  23. Posted December 19, 2007 at 3:37 pm | Permalink

    Wade:

    All but dead? We’ve seen two new, high quality, high-performance implementations in the last couple of years, and really great abstractions like dojox.gfx that make it completely portable to differing back-ends. While SVG may not be the apparent winner, *we got the new features regardless*, and despite SVG’s serious suckage as a spec, it may yet win. Long story short, we can use vector graphics to draw everywhere today.
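
    For the curious, here’s roughly what drawing through dojox.gfx looks like. This is a minimal sketch against the Dojo 1.0-era API, written from memory, and the “chart” node id is invented for the example:

    ```javascript
    // Draw a couple of shapes with dojox.gfx. The same code renders via SVG
    // where it's available and via VML in IE, which is the point: we get the
    // capability everywhere without waiting on a single winning spec.
    dojo.require("dojox.gfx");

    dojo.addOnLoad(function () {
      // Create a drawing surface inside an existing node with id="chart"
      // (a hypothetical id used only for this sketch).
      var surface = dojox.gfx.createSurface(dojo.byId("chart"), 300, 200);

      // One bar of a bar chart, filled and stroked declaratively...
      surface.createRect({ x: 20, y: 60, width: 40, height: 120 })
             .setFill("#4682b4")
             .setStroke("#333333");

      // ...and a circle.
      surface.createCircle({ cx: 200, cy: 100, r: 50 })
             .setFill("#cc3333");
    });
    ```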

    If that’s not a success story, I don’t know what is.

    Regards

  24. Wade Harrell
    Posted December 20, 2007 at 10:32 am | Permalink

    @alex: Inline SVG did not happen. Compare the current state to where the Adobe plugin was at its initial release; even those few browsers that offer limited support are just scratching the surface. Granted, netzgesta.de/cvi/ is doing impressive work with canvas, but when I consider the work my team did for battlebots.com in 2000, canvas does not add up to much.

    Of course, inline SVG would require real across-the-board XHTML, and even that has not yet happened (I am talking application/xhtml+xml; 90% of the pages out there that use the XHTML doctype are served as text/html anyway, resulting in soup).

    XHTML is a perfect case of user demand being ignored by application vendors. It is a feature everyone wants, it is even standardized, but the key player has yet to support it. I generally do not bash MS (XHR and client-side XSLT since IE5 gave them big gains in my book), but the XHTML thing is a sore point. In the context of their early XML dedication I have never fully grasped that choice…

    MS got burned on SVG; they wanted VML, so SVG will never happen. They introduced XHR and everyone else quickly followed their lead.

    I agree that the standards should mostly be a map of the common ground, a “safe list” of features that can be used with neutrality. Sometimes, as in the case of SVG, they are born from an attempt to identify that common ground when implementations have greatly diverged. Unfortunately, corporate egos come into play at that point. Vendor innovation to serve their own needs has given us some of the most critical technologies used by web developers today. Vendor resistance to implementing the desires of users or standards bodies has stunted the growth of the industry (a bubble burst did not help either). It is a double-edged sword. Neither direction is 100% right or wrong.

    I want the innovation (from all vendors), and I want to use that innovation (from all vendors) in a standard way that will give the same minimum set of results in all environments.

  25. Posted December 20, 2007 at 11:35 am | Permalink

    Wade: Actually, inline SVG doesn’t depend on XHTML. The SVG WG and the HTML WG are currently discussing how to get inline SVG in text/html. And with script, you can already do it today in Opera, Firefox, and Safari.
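
    For concreteness, a minimal sketch of that script-based approach using the standard DOM namespace APIs (the sizes and shapes are arbitrary):

    ```javascript
    // Create SVG content from script in an ordinary text/html document and
    // append it to the page. Renders in the SVG-capable browsers of the day
    // (Opera, Firefox, Safari); IE has no native SVG support.
    var SVG_NS = "http://www.w3.org/2000/svg";

    var svg = document.createElementNS(SVG_NS, "svg");
    svg.setAttribute("width", "200");
    svg.setAttribute("height", "200");

    var circle = document.createElementNS(SVG_NS, "circle");
    circle.setAttribute("cx", "100");
    circle.setAttribute("cy", "100");
    circle.setAttribute("r", "80");
    circle.setAttribute("fill", "steelblue");

    svg.appendChild(circle);
    document.body.appendChild(svg);
    ```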

  26. Posted December 22, 2007 at 3:04 pm | Permalink

    Do you seriously want to go back to the old model of:

    Browser with the most market-share = Gatekeeper to the Web.

    Really? I just can’t wait for a proprietary system that requires a service contract just to be able to code for it. Take a look at the locked and crippled phones from wireless carriers; something similar is how I’d envision this ending up.

  27. Posted March 2, 2008 at 5:45 pm | Permalink

    Alex, I just found your blog and I’m loving it. You have the right idea about the web, and it’s great to see so many of the ideas I’ve had kicking around in the back of my head expressed so well.

  28. Posted July 31, 2008 at 4:27 pm | Permalink

    You’re definitely right that innovation doesn’t come from the standards bodies, it comes from the market. I recently started blogging at the above link about what a new thin client might look like. I talk in the second post about how to keep new software platforms open while encouraging innovation. Any ideas and feedback would be appreciated.

  29. Posted November 18, 2009 at 12:37 pm | Permalink

    To Marty McKeever and his reply (“Standards cannot drive innovation…”): Marty, you’re right in the first part of your sentence, but not in the second one… Most standards exist to reinvent innovations that already exist.

35 Trackbacks

  1. [...] Alex Russell calls for a return to the browser wars, citing (among other things) the stagnancy of the W3C as a part of the problem, with the argument that browser makers are the ones who can innovate and they’re being prevented from doing so by a slavish insistence on “standards”. Meanwhile, Andy Clarke calls for the current W3C CSS Working Group to be immediately disbanded, Opera file an antitrust complaint against Microsoft, the HTML5 spec removes a recommendation for non-patent-encumbered video formats after pressure from Nokia and Apple, and all the old fights start up again. Fire and brimstone coming down from the skies. Rivers and seas boiling. Forty years of darkness. Earthquakes, volcanoes. The dead rising from the grave. Human sacrifice, dogs and cats living together. Mass hysteria. [...]

  2. [...] Alex Russell (via)   [...]

  3. By Shallow Thoughts » Return to the Web of the 1990’s? on December 16, 2007 at 10:48 pm

    [...] To be fair, Jeff’s comments are based on what he read in Alex Russell’s latest blog entry, The W3C Cannot Save Us. So, I went and read what Alex said (edited to pull out my point — and emphasis mine): …there are huge tracts of the HTML, CSS, and DOM spec’s that you simply can’t use. IE’s bugs combined with its market share conspire to ensure that’s true…Mozilla, Opera, and Safari all have their own warts as we get to the edges of what’s even theoretically possible w/ current specs. And that’s not even taking into account how utterly wrong, broken, and silent the specs are in several key areas. Even if the specs were great, we’d still be gated by the adoption of new renderers. [...]

  4. [...] There have been a lot of articles around lately about the CSS Unworking Group and why the W3C can’t save us. I think it is time for change, it is time for us to move forward with technology on the web. I think we’re ready. [...]

  5. By Pierre-Luc Babin » Jeff Croft et les standarts web on December 17, 2007 at 9:26 am

    [...] The W3C Cannot Save Us [...]

  6. [...] It seems the future of web innovation is a popular topic nowadays, as Alex Russel points out in his scortching hot “The W3C Cannot Save Us” Article. In the article, Alex pours down hot oil on standards working groups and incites browser vendors to stand out with new non-standard features as a means to get them recognized and standardized eventually. [...]

  7. [...] There’s been a lot of talk lately about the development (or lack of) web standards, but not going to create multiple posts about it (yet). Best summary of everything so far is the “Future of Web Standards” article at b-list.org. Also “Reigniting the Browser Wars” and The W3C Cannot Save Us are excellent reads. Tags: CSS, Firefox, HTML, IE, W3C, Web Standards [...]

  8. [...] There’s recently been a lot of noise about a return to the browser wars (Alex Russell, Jeff Croft, Stuart Langridge, James Bennett). The point being that standards take eons to complete and standards bodies aren’t the right people to be inventing cool stuff for us to use on the web, it’s us and the browser makers that should be creating the cool stuff for the standards bodies to codify. Ok, that all sounds great (albeit an incredible simplification of a multifaceted issue). So, let’s go out and push that envelope. In order for the future to be better by a large amount, it must be different by a large amount. [...]

  9. By Team Hamlett on December 17, 2007 at 9:28 pm

    [...] The W3C Cannot Save Us (tags: standards web html xhtml essay rant) [...]

  10. By QMD - when you want to know » QMD 8 update on December 18, 2007 at 5:52 am

    [...] I just thought an update was needed. We’re so busy trying to nail this down, that we’re not reaching out as much as we’d like. I saw a quote this morning from Alex Russell: “In order for the future to be better by a large amount, it must be different by a large amount.” I think you’re going to see that QMD8 is a huge upgrade. And hopefully you’ll bear with us while it takes longer than expected to deliver it. [...]

  11. [...] The voice of reason in all this seems to be Alex Russell of the Dojo Toolkit. In his article, The W3C Cannot Save Us, he explains that what is really holding the Web back is our fanatical devotion to web standards, and the expectation that they can dictate what new features should be added to web browsers. [...]

  12. [...] Le W3C peut-il encore nous sauver?  [...]

  13. [...] Today, i was thinking about Alex Russell’s “The W3C cannot save us“, written on the last Sunday. Ok, he has exposed some irrefutable truths IMHO, like when he wrote that “expecting IE8 to be “fully standards compliant” is a fools game“; however, there are aspects wherein my opinion diverges from his text. [...]

  14. By Trent’s Blog » Microsoft += Microsoft–; on December 19, 2007 at 8:12 pm

    [...] As far as the heavy computing goes, well Ubuntu Fiesty has been very pleasing, especially since most of my computing needs for programming are well suited for it, where as my leisure computing is well suited for Vista with iTunes and Office.  The two together seem to balance each other out very nicely, Which brings up an interesting topic from a post I read a the other day that not following web standards may actually help… [...]

  15. [...] In response to recent articles by Andy Clarke and David Baron, Alex recently said that the W3C cannot save us. The most significant point being made is that you cannot standardize the future, and you should not punish those who attempt to push the envelope through experimentation and invention. [...]

  16. By W3C v/s Developer Community | iface thoughts on December 20, 2007 at 12:04 pm

    [...] There is also the concern that W3C is hurting innovation. Jeff Croft supports Alex Russel in asking W3C to loosen the noose of standards so that the browser makers do not get curbed. For long many have felt that W3C is working too slow in evolving CSS and HTML standards to solve today’s needs. Andy Budd suggested we have an interim version of CSS. [...]

  17. [...] If you’ve read any web design blogs during the past week you will no doubt be aware of the hornet’s nest that has been stirred up by Opera’s antitrust complaint against Microsoft. The issues at stake go far beyond Opera trying to put a dent in Internet Explorer’s market share, and debate within the web industry has quickly escalated into an argument about the relevance of web standards, and the organizations that govern them. Andy Clarke has called for the dissolution of the W3C CSS Working Group. Alex Russell has accused Jeffrey Zeldman of hurting web developers and called for a return to the browser wars of the 90s. Jeff Croft has stated that “compliance is f**king boring”. Even Wired has provided coverage of the discontent. [...]

  18. [...] Alex Russell discussing why the W3C cannot save us To get a better future, not only do we need a return to “the browser wars”, we need to applaud and use the hell out of “non-standard” features until such time as there’s a standard to cover equivalent functionality. Non-standard features are the future, and suggesting that they are somehow “bad” is to work against your own self-interest. [...]

  19. By FBS Blog » Blog Archive » Tying The Pieces Together on December 24, 2007 at 10:43 am

    [...] Last week, I read a post by Alex Russell called The W3C Can’t Save Us.  The post deals with web browser (HTML, CSS, etc.) standards and how they seem stuck and lacking innovation, but the ideas are instructive to those of us in real estate, too:  It’s clear then that vendors in the market are the ones who deploy new technologies which improve the situation. The W3C has the authority to standardize things, but vendors have all the power when it comes to actually making those things available. Ten years ago, we counted on vendors introducing new and awesome things into the wild and then we yelled at them to go standardize. It mostly worked. Today, we yell at the standards bodies to introduce new and awesome things and yell at the browser vendors to implement them, regardless of how unrealistic they may be. It doesn’t work. See the problem? [...]

  20. By Goodbye 2007 - Happy New Year! - Robert’s talk on December 29, 2007 at 2:43 pm

    [...] Basically, it all started with Opera bringing Microsoft to the European Commission about Internet Explorer and Web standards support (or rather, lack thereof). Then followed by Andy Clarke’s thoughts about the CSS Working Group. It all caught fire by then, topped off with a Molotov cocktail from Alex Russel in The W3C Cannot Save Us. [...]

  21. By yardley.ca / dash » Who makes standards? on December 31, 2007 at 12:50 pm

    [...] I don’t think many of my readers are designers, so many of you haven’t read Alex Russell’s “The W3C Cannot Save Us” yet. I stumbled across it this morning – in short, it argues that because browsers have been taught to follow web standards instead of innovating, and because web standards are created by bureaucratic ‘working groups’ that move quite slowly, innovation has slowed to a crawl. (Innovation being enhancements to CSS / HTML / DOM itself, not what’s being built on them.) Alex also argues that it’s the browsers themselves that have the power to truly innovate – the W3C has authority, but can’t build a rendering engine – and that they need to be encouraged to build non-standard features. In short, a return to the ‘browser wars’. Fascinating. I’m still sorting through the fallout. (See here and here for counterarguments.) [...]

  22. [...] I’ve been reading with great interest the current concern over the W3C process for web standards and the lack of progress being made. Andy Clarke kicked it off by asking the W3C to disband the CSS working group. Alex Russell followed up declaring that the W3C Cannot Save Us. Jeff Croft, playing his usual role of rabble rouser, echoes Alex’s sentiments in an post entitled, “Do We Need a Return to Browser Wars. [...]

  23. [...] Part II: Browser Wars are Back Background: I’ve been reading with great interest the current concern over the W3C process for web standards and the lack of progress being made. Andy Clarke kicked it off by asking the W3C to disband the CSS working group. Alex Russell followed up declaring that the W3C Cannot Save Us. Jeff Croft, playing his usual role of rabble rouser, echoes Alex’s sentiments in an post entitled, “Do We Need a Return to Browser Wars. [...]

  24. [...] There’s been a lot of debate recently about whether the current charter is adequate. The CSS Working Group read those, but in order to focus on the work of the group (rather than meta-issues like its constitution), I don’t want to re-open that discussion here; if you have something that hasn’t already been said about the constitution of the CSS working group, please comment on those discussions. [...]

  25. By Bill Higgins :: RIA weekly podcast and errata on February 18, 2008 at 12:49 pm

    [...] another point in the podcast, I mention a blog post by Alex Russell of Dojo where he talks about standards not saving us and encourages browser vendors [...]

  26. [...] the CSS working group and the W3C in general have come under fire as Opera launches an assault on Microsoft. A counterculture of pro-proprietary technology advocates [...]

  27. [...] cele mai mari companii din lume isi impun punctul de vedere, nu neaparat spre binele clientilor lor [13]. [...]

  28. [...] for not pushing the web forward is both humorously off-target and distressingly common. I’ve written about this before, but fundamentally you can’t blame the W3C for failing to act because it’s not the [...]

  29. By The Universal Desktop mobile edition on July 15, 2008 at 11:25 am

    [...] and really, the W3C wasn’t made for innovation and people are starting to realize that. Alex Russell realizes it, and more [...]

  30. [...] and liberating. But where the spec is in-front of the important implementations…well, I’ve ranted before on the topic. CSS sucks, and the editor of the spec has now written at length of his intent [...]

  31. [...] must be willing to push the web in new ways. Indeed, Alex Russel (of Dojo) comments on this in a post on the failure of the W3C: To get a better future, not only do we need a return to “the browser wars”, we need to applaud [...]

  32. [...] compilation of comments on this topic. I especially found this quote on standards from Dojo’s Alex Russell to be very insightful: we need to applaud and use the hell out of “non-standard” features [...]

  33. [...] of the ECMAScript group and the CSS-WG in the W3C. Plenty of good discussions of what went down here, here, here and here.It’s pretty crazy to me that all the reasons being cited as to why these [...]

  34. [...] another point in the podcast, I mention a blog post by Alex Russell of Dojo where he talks about standards not saving us and encourages browser vendors [...]

  35. By Terug naar jQuery | Nomark on June 6, 2009 at 6:53 am

    [...] approach for associating behavior directly with the element. Alex Russell of Dojo has written about this previously. Second, all effects that can be accomplished by using element attributes can also be accomplished [...]