LHB: Same Story, Different Tune

I’m not sure how I missed it for so long, but I’m now reading every word of the Linux Hater’s Blog. It’s beautiful catharsis, and I say that as someone who used to serve as President of a university LUG and who writes open source software for a living.

Like the author of LHB, I’ve written my share of low-level Linux stuff and shudder in horror at how backward the Linux desktop still is. X11 is an abomination, and configuring sound or wifi or some new-ish device…well, I still have those scars. It wasn’t that long ago that I was hacking together bits of kernel patches to get an early FireWire iPod (a gift from Jennifer) working on my SuSE desktop. And I was using SuSE because it usually required the least futzing of the then-available distros. Even when Linux was my desktop OS of choice – I had since climbed down from my brief flirtation with OpenBSD – it was blindingly clear to me that building my own packages was a fool’s game. My time is simply worth more than that. Perhaps the most cogent point I took away from reading the LHB archives is that building professional code for the “Linux Platform” is nearly impossible, not because it can’t be done, but because it costs too damned much to justify. The Linux crowd seems to mis-value the immediate vs. long-term costs of choice. By failing to provide a binary-friendly environment, the Linux world creates conditions friendly only to Open Source software, crippling its platform and robbing it of the capital investment that would allow it to truly compete. Many people who suffer through some Linux distro as their desktop environment no doubt see this as a good thing. I, however, need to get some work done. The LHB cogently lays out why everyone else on the planet whose time is worth more than what Amazon charges for a CPU-hour on EC2 similarly dismisses Linux for anything but servers.

The almost religious belief that choice is good ignores the inability of most people (myself included) to fully judge the long-run costs of any given technology decision. Oddly, the web browser world seems stuck in a similar position. Choice in browsers is promoted like some sort of panacea, when in fact our big problem isn’t choice, it’s that browsers simply can’t natively attempt the feats we need them to accomplish. Smart people don’t replace their browsers, they use what works until it doesn’t any more. No wonder it’s taken Firefox so long to gain market share. Like Linux distros, browsers evolve in hodge-podge ways, never quite tracing a straight line toward real progress. The refrain of “standards will save us” seems to ignore the reality that, like the LSB, the existing W3C standards are absolutely insufficient to address the problems at hand. Both CSS3 and HTML5 are nice first stabs, but they don’t get us “there”. For Linux, the LHB points out that there’s zero reason to not ship “the same bits”, and for the web, the issue is that content can’t tell the browser “no, really, use that renderer”. Interestingly, Microsoft tried to convince the world that we should version our content, and the standards zealots just shot them down without really considering the consequences. Instead of making the world safe for a better web, the HTML standards geeks did the most powerful thing they could do to prevent it from materializing. In essence, they preserved “choice” at the expense of utility. What a waste. Seriously, if these are the deep thinkers on “our team”, why not just go use Flash to build everything?

Many people have gotten worked up about Microsoft’s role in killing Netscape (although they tend to minimize Netscape’s role in its own demise), but ISTM that the real long-term harm done here has been to remove the renderer as a profit center. Once Microsoft set the price of the browser at “free”, they effectively killed browser evolution as part of anything but an OS-based platform play (in part, to preserve their existing OS-based platform play). Interestingly, then, web-based services have, to date, routed around the difficulties of the platform to deliver apps that seemed well out of reach of HTML 4.01 as implemented by IE 6, but well, we’re a plucky lot, aren’t we? The most progress being made right now seems to be coming from a large software vendor in Cupertino with an OS-based platform play that absolutely needs the web to sparkle in order to drive adoption of their OS and hardware.

Like Linux, the web will probably lurch forward this way for the foreseeable future. The standards zealots smacked down the IE team so hard on content versioning that I don’t think anyone else will have the testicular fortitude to try again for a good while. It’s down to the whole “vision thing”, and the web standards crowd doesn’t seem to have any. It’s about time someone took the punch bowl away from them once and for all. The open web needs real progress too badly to stall any longer.


  1. joe
    Posted July 6, 2008 at 10:22 am | Permalink


    I take it you haven’t tried (K)Ubuntu then? I have found it to be a complete joy over a PC (and Mac seems like a little version of M$ with regards to only letting you do things their way, though with a nice user experience!)

    Doesn’t the web standards crowd at least have the standards vision? Write once, kinda render anywhere?

  2. Bob
    Posted July 6, 2008 at 12:00 pm | Permalink

    I agree with your call on HTML standards. The W3C seems to be adding polish to a turd, rather than confessing HTML is a version 1 that needs upgrading. Where’s the innovation in standardsland?

    It’ll push people to Flash, to Silverlight (nah), well, to /something/.

    PS. AFAIK, MS will still implement the version tagging.

  3. Posted July 7, 2008 at 1:32 pm | Permalink


    I have indeed tried Ubuntu, and Kubuntu is simply as clear a demonstration of the false economy of choice as I can possibly imagine. I love KDE, it was my desktop for a long, long time, but the fact that Ubuntu has (essentially) forked to install a competitor to the mainline desktop development effort is sheer idiocy. How should a new user choose? How should Canonical talk to ISVs about developing for “their platform” when even the desktop UI is “in play”? If the Linux world simply can’t understand that forks like Kubuntu don’t actually provide any long-term “benefit” other than “choice”, then it truly is doomed to irrelevance. An efficient market relies on competition, and Linux on the desktop has it in spades. OS X and Windows both work better, support more software that real people want to use, and have much more consistent desktop and interaction metaphors. That is the competition for the Ubuntu stack…not KDE vs. Gnome. Those are sideshows, and until the Linux world can get it together and learn the discipline of not forking, the current passive-aggressive mode of technical conflict resolution will reign and the Linux desktop will continue to rob itself of the momentum needed to effectively compete with the real marketplace.

    Sadly, Ubuntu suffers the same fundamental issues that my usage of SuSE 5-10 did, and Debian and RedHat before them. They all show the promise of being something I want to use, and the initial rush of discovery with all of them is surely exhilarating. But beyond that, they become a gigantic chore. Saying that I can hack on something (and even making it kinda easy) is no substitute for making a system I want to use. It takes some perspective to realize that even though you may be fully capable (on a theoretical level) of solving all of the problems presented by the desktop, there’s no way that an individual can do it on their own. I’ve been taught to fish, and now that I have, I want a deep-sea trawler, not the small canoe I learned on. I can make a canoe myself, but I’m entirely willing to pay to buy or rent the trawler when my livelihood is on the line, and WRT my choices of tools it most assuredly is. It makes the most sense for me as an ethical person to put my money where my mouth is and go build good things in a way that’s best for everyone in the best way practical, rather than fretting about whether the art of canoe making will suffer due to my choice not to build canoes and do subsistence fishing for a living. This is the false economy of choice writ large: it is absolutely better for me personally and for society at large to encourage the best long-term allocation of capital. Insofar as Open Source can represent a hedge against the apparent default of monopoly creation in the software world, it helps allocate capital efficiently. But unless it competes for that capital effectively in the short run by making something that people actually want to use, it doesn’t serve as an effective competitive backstop (e.g., driving the price of the commodity to zero). Instead, it represents another inefficient allocation of capital. I fear that Linux distros (and browsers) are now stuck in such a mode.

    At least Canonical seems to have enough sense to tell “the community” to go stuff it on a semi-occasional basis, particularly when it comes to drivers. Yes, it would be good if drivers were open, but to what end? What huge community of people will show up to write high-quality drivers without compensation? You can’t run back to the usual tropes with me on this argument: I know how this stuff actually works and what actually happens, namely that one person makes a huge difference when the conditions are right. Sometimes that’s on a volunteer basis, but more often it has to do with commercial interest and a smart guy in a cube somewhere in Texas or China getting paid to make a vendor’s new device “work with Linux” because some other customer of theirs is running Linux (to reduce costs in their embedded platform). It’s pretty confusing, then, how the Free Software folks work themselves into spasms of moralism around driver development, but luckily Canonical is run as a business and therefore they can ignore much of the hue and cry.

    Ubuntu won’t be replacing OS X on my desktop for the foreseeable future because while Windows-esque levels of driver support have nearly been achieved, the rest of the stack still sucks, and there’s not a lot that Canonical can do about it. They’re not big enough (yet). Real integration with hardware is a fantasy. Media? Hrm. Few (if any) of the beautiful pieces of commercial software that I use on a daily basis run there. Geez, I mean, look at how long and hard the Linux world had to flog Adobe just to get Flash support. Let’s not bring up Photoshop (and no, The Gimp is NOT an adequate replacement). Let me put it to you this way: when I shut the lid on my MacBook Pro, it goes to sleep immediately. When I open the lid, it starts back up. Always. I haven’t tweaked any drivers, futzed with ACPI settings, or built my own kernel to make this happen. And all of my Mac laptops have worked this way, across 2 processor architectures and many, many interim changes and component updates to top-of-the-line hardware.

    As for the whole “vision thing”, there’s no vision in standards. Never has been, never will be. Standards are a tactic, not a strategy. Hell, standards aren’t even a *goal*. Vision is about goals. Standards are a way to get everyone in the room to agree to do something that someone else wants them to do but which they themselves have no intrinsic interest in doing. The thing that they’re there to agree on? Now that might be a goal or perhaps a strategy. But getting people into the room to make commitments and then getting them to follow up on those commitments? That’s a tactic. Confusing it with a strategy or a goal is the chief sin of the web standards crowd, and we all pay for it dearly.


  4. Posted July 7, 2008 at 2:02 pm | Permalink


    You’re right that MSFT will still do version tagging, but it’s essentially neutered. The basic problem is that content which is served up still can’t know what environment it’s going to show up in; it can only strongly suspect as much. Now, the IE team didn’t lay out a case that strong versioning is what they were trying to do (and I have reason to suspect they weren’t thinking that far ahead), but locking things to one rendering mode unless and until content opted out would have set the stage for a world where we could have done more “staged” upgrades to the web, and that has the power to enable faster renderer evolution.
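    For readers who missed the kerfuffle, the versioning mechanism in question is IE8’s `X-UA-Compatible` switch, which lets a page pin itself to a particular rendering engine via a meta tag (or an equivalent HTTP response header). A minimal sketch of the shipped, opt-in form:

    ```html
    <!-- Pin this page to IE7's rendering engine, even in newer IE versions.
         The same value can be sent as an X-UA-Compatible HTTP response header. -->
    <meta http-equiv="X-UA-Compatible" content="IE=EmulateIE7">

    <!-- Or explicitly request the IE8 standards-mode renderer: -->
    <meta http-equiv="X-UA-Compatible" content="IE=8">
    ```

    The original proposal would have made legacy rendering the default for untagged content, with new engines opt-in; after the backlash, the defaults were inverted, which is what leaves it “essentially neutered”.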

    Alas, it’s not to be. Default-renderers are still prevented from taking any real risks which might evolve things faster by the weight of all existing content (which they’re still expected to handle in backward-compatible ways).

    Thanks for nothing, WaSP.

  5. joe
    Posted July 7, 2008 at 3:08 pm | Permalink

    Thanks for responding, but I think you took my questions a bit too personally judging by the tone of your response! I was just saying I found Kubuntu to be great for me (I have a PC and OSX at home too).

    By saying that I didn’t mean to say that is the only way (I certainly accept that OSX is more polished — although when I close the lid on my laptop, Kubuntu goes into standby quite nicely — but I take your point; it doesn’t work consistently for everyone, and even more generally than that, the forking of everything can be annoying).

    Or maybe I just hit a nerve :)

    My standards question was just wondering out loud — I ain’t one of them — I certainly really like the idea of standards, but agree with you; it is more of a strategy or a means, not an end in itself.

    (And I agree with you also about the misleading argument of choice; at some point various industries become inefficient for society because choice is excessive; duplication at a certain level leads to wasted use of resources, and choice does not help people make informed rational decisions; it just confuses!)

  6. Posted July 7, 2008 at 6:10 pm | Permalink

    Hey Joe:

    I didn’t take it personally, but I do remember being in your shoes. Oh the times I’ve wished I could take it back. I remember saying something along the lines of “oh? you had that problem? you should totally try Distro X. It’s way better”…only to be the guy on the other end of the line when it wasn’t, at which point it was usually too late for my poor victim. I owe everyone I helped convert to Linux an apology and a copy of OS X, I’m afraid.


  7. Posted July 10, 2008 at 12:29 am | Permalink

    If there’s anything we could add to HTML5 to make it more than insufficient, please do let me know (ian@hixie.ch), or post on one of the HTML5 mailing lists. http://whatwg.org/mailing-list#specs

  8. Adam
    Posted July 10, 2008 at 4:39 pm | Permalink

    “What huge community of people will show up to write high-quality drivers without compensation? You can’t run back to the usual tropes with me on this argument: I know how this stuff actually works and what actually happens, namely that one person makes a huge difference when the conditions are right. Sometimes that’s on a volunteer basis, but more often it has to do with commercial interest and a smart guy in a cube somewhere in Texas or China getting paid to make a vendors new device “work with Linux” because some other customer of theirs is running Linux”

    This is a bogus argument. The huge community of people have already shown up and have proven they are willing and able to write high quality drivers. What is keeping the high quality open source drivers from being written is the hardware vendors, who do not welcome wider platform support, greater community involvement, more branding recognition and greater compatibility, because of a concentration on short-term profits, enforced with patent threats and copyright lawyers. It’s not the time of the developer that costs the money to reverse engineer, it’s that reverse engineering is commonly seen as something that is (or should be) illegal, and the developers do not have the time to deal with the ramifications of that. In their efforts to differentiate themselves on what amounts to commodity hardware, they don’t follow standards that are already in place, requiring a manufacturer-provided driver.

3 Trackbacks

  1. […] I just came across this very well-written and thought-provoking piece by Alex Russell on the similarities between Linux and the open web. […]

  2. […] of a swarm of WaSPs, or worse. Attempts to even begin to lay the groundwork for such a mechanism have been shot down forcefully by many folks who, like Paul, view “fixing the web” as the W3C’s […]

  3. […] Much of the conversation has revolved around how we maintain a user-centered focus as a project as we work to attract volunteer developers. (A challenge all consumer-facing open source projects face.) […]