I don't write much about politics here, but the amnesty-for-telcos language being fought by the EFF really gets my goat. The whole robo-fax-as-advocacy thing isn't really my style, so what follows is the letter I sent to Senator Feinstein today after finding that her San Francisco office's voicemail box is full and that her Washington office isn't staffed on Saturdays either.
My name is Alex Russell; I'm a software engineer and a constituent of yours in San Francisco. I vote in every single election: federal, state, and local. I'm not "politically active" per se; I don't consider myself a partisan for any party, nor do I ever vote a straight party ticket. I just want to see deliberative government which takes the needs of voters seriously. I try to stay on top of issues, study seriously for elections, and come to reasoned positions about the matters before me as a citizen. It's strange, then, that this may be only the second time I've ever written my senator (previously I believe I wrote my senator in Indiana before I was of voting age...with predictable results).
The reason I'm writing is my incredulity at your apparent support of the proposed language granting telcos immunity from prosecution for illegal acts taken on behalf of the executive branch (S. 2248, the FISA Amendments Act of 2007). You take many policy stands with which I disagree, but this is beyond the pale. The current administration has run roughshod over the mutual respect necessary for our co-equal branches of government to function effectively on behalf of the people. I hardly need to cite examples of executive over-reach; they are before you and the committees on which you serve in the form of testimony nearly every week.
It is troubling, then, that your position on warrantless wiretapping should be so blasé, so deferential in the face of a seemingly explicit policy from the executive which asserts that it is immune from oversight. The existing FISA statutes (before this year's revisions) provide broad leeway to the executive and little scrutiny. This default policy of the FISA court in favor of lessened oversight, then, in my opinion, casts it as the limit...the furthest down the path toward a surveillance society that a democracy beholden to the rule of law can tolerate. What is being proposed in S. 2248 is something entirely different in character.
It is anathema to the concept of equality of the judiciary with the legislative and executive branches to suggest that when the executive branch entices firms to act illegally on its behalf, shrouded by secrecy, the people have no right to seek redress through the judiciary when the legislature is too cowed or blind to hold the executive to account. This is exactly what is being proposed. It is one thing for Democrats in congress to fail to act with regard to what is plainly illegal behavior by the executive. I understand and appreciate the concerns and constraints which lead you and your colleagues to ignore the will of the people in the short term. But I cannot fathom a strong case for stripping the judiciary of its oversight role as well. Please, I implore you, do not cave to this.
I fully agree with those who suggest that it may be sub-optimal for the proxies of the administration to take a fall for the administration's illegal acts, but I do not see where the cause of civil society and the rule of law can turn when redress through the courts is removed as the backstop on the slippery slope of executive power. The fourth estate has already failed us here, and congress has been unwilling to strongly challenge this illegal behavior. If congress is to have a hope of addressing this behavior legislatively in the future, someone needs to be able to shine sunlight on the illegal activities of the administration. If congress is unwilling to take that up, then at least allow the courts their rightful role.
Please, Senator Feinstein, denounce, work against, and vote against the proposed language which grants amnesty to secrecy, thereby giving anyone who can wield the language of security and the privilege of secrecy the force of law to do as they will.
As you may have seen other places, Dojo 1.0 was released yesterday. Most reports on IRC and on the forums indicate that the transition for 0.9-based apps has been smooth sailing. I anticipate we'll be following up with 1.0.1 very shortly as we tamp down the issues that inevitably come up with such a large release, but so far so good.
Following up shortly on the heels of the release, Dylan posted screenshots of Dojo charting running on the iPhone. It's a testament to the architecture that Eugene and Kun put together for dojox.gfx that Chris Mitchell's awesome canvas renderer was able to slot right in to make this possible. For anyone counting, that now makes 4 independent renderers for the awesome shape-oriented GFX API: SVG, canvas, VML, and Silverlight. Portable, non-proprietary 2D graphics in a browser are really here.
Just hours after that, James Burke announced that 1.0 is available on AOL's CDN, meaning that you don't even have to download 1.0 to try it out. Just point to the right URL to include Dojo and you're up-and-running. Sweet.
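For the curious, using the CDN-hosted build is just a script tag plus whatever modules you need. Here's a minimal sketch; the exact CDN path below is an assumption from memory, so check James's announcement for the canonical URL:

```html
<!-- Cross-domain Dojo build from AOL's CDN (URL path assumed; verify against the announcement) -->
<script type="text/javascript"
        src="http://o.aolcdn.com/dojo/1.0.0/dojo/dojo.xd.js"></script>
<script type="text/javascript">
  // The xd build loads modules cross-domain, so dojo.require still works:
  dojo.require("dojo.fx");
  dojo.addOnLoad(function(){
    // everything requested above is guaranteed loaded at this point
    dojo.fadeIn({ node: "greeting" }).play();
  });
</script>
```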
Bryan Forbes jumped in with a beautiful Grid example today, and he tells me that it's going to be a recurring feature over on the SitePen blog which you'll also be able to catch over on Planet Dojo.
We've got more up our sleeves, and I can't wait to start talking more about the awesome features we've been busy baking into 1.0. Dojo is finally more than the sum of its well-designed parts. The team that put this release out made a huge gamble in January of this year, and 1.0 is proof that it has more than paid off. One of the major decisions was to ensure that dojo.js could be used on a stand-alone basis much the way other libraries tend to be. The result is a single file which is a tiny 23K on the wire, edge cached, and packed with all the utilities you're really going to need for "low level" Ajax. We haven't seen too many people using it stand-alone yet, but I expect to see a lot more of that soon.

dojo.js is amazing infrastructure for progressive enhancement all by itself. From animations that handle colors to baked-in JSON support to amazingly robust event normalization, dojo.js is industrial-strength plumbing, and Dijit and DojoX make full use of it.
The next six months are going to be exciting.
Anyone with an RSS reader and a passing interest in languages or browser technology has by now caught a whiff of the strange odor coming from the ES4 vs. ES3(.1) debate. Before I dig in, I should mention by way of disclaimer that I'm an invited "expert" (har!) to the TG1 working group. Take the rest of this post with the appropriately sized dose of granulated sodium.
A lot of dirty laundry is getting aired and I'm afraid I haven't been immune from hyperbole and inaccuracies in the process, so charged and furious is the debate. There's been procedural wrangling, odd missives, emphatic rebuttals, painful comment threads and expositions on theories of how the web should evolve. Even a bit of popcorn sellin...er...press coverage.
We're starting to miss the plot a bit.
Wrapped up in the question of "will Microsoft implement ES4?" is, implicitly, "what will Microsoft implement, and when?". Answers to these questions aren't about features so much as they are about trust. Reflected in the comments on these threads are, it seems to me, two primary voices, roughly paraphrased as:
- "The web is evolving again because of Ajax and libraries. If browsers start changing, it'll be chaos. Stability enabled this round of growth. Don't blow it."
- "We must have browser evolution to fix the huge problems with the existing state of the art. Give us big new stuff ASAP."
We've seen both of these voices before, particularly with regard to W3C activities. Some of my previous comments on that topic made the debate louder without (I'm afraid) providing much perspective about how the community can effectively advocate for fixes. What's good about the ES4 debate (in contrast w/ my CSS ramblings) is that it is scoped to particular implementers. Insofar as advocacy makes a difference, lobbying a moribund working group is nearly useless. They have authority but no power. Power in these situations always lies with implementers, aka browser vendors. The CSS discussions are weird and airy because there's no real skin in the game. Swatting at the W3C is just so much shadow boxing. Asking browser vendors to take risks, to show some good old-fashioned technical leadership, and to put some of their theories about how to evolve into their competing implementations...now that's got legs.
The first perspective, I think, is borne of a fear of loss mixed with a big pinch of "IE4 vs. NN4 sucked" (cue involuntary twitching). The web won. The web is strong. The web's strength is due in large part to the fact that (as never before) it "is" a single thing in the minds of developers. I'm incredibly sympathetic to this viewpoint, and the argument that major changes in HTML and other fundamental web technologies create opportunities for closed technologies to sell the "but we have a uniform deployment environment" pitch does hold some water. But not enough.
Closed-platform vendors have always been selling that advantage, in part because it may be their only sustainable advantage over mass ubiquity and real competition. What advocates of the "go slow" position rightly are pointing out is that when faced with what to do next, the pace of change suggests that the deployed browser population is set in stone. It has always looked this way, though. From the days when Netscape 3.2 kept us from using DHTML in any real form to the years when NN4 just couldn't die fast enough all the way through the new deadpool taking bets around the demise of IE 6, progress on the web has always been gated by deployed systems. Yet somehow we've got most of HTML 4, CSS 2, and ES3 in a semi-portable form. Clearly the web is robust enough to handle moderate doses of divergent implementations. Progress neither asks for permission nor arrives with a press release. It's made one browser upgrade or replacement at a time.
Both of the perspectives on where to go from here are veiled ways of asking the question "who can make it better for me?". It turns out that the Ajax toolkits matter (if they matter at all) because they can give us what we want without waiting for browser vendors to field implementations; for some value of "what we want". That might seem like the future, then. Just ask your friendly Ajax toolkit vendor for something you need and wait for it to be implemented. Would that it were so simple.
Ajax toolkits (like Dojo) represent "cheap" agreement. Instead of the browsers duking it out on features and subsequently unifying APIs at a standards body, toolkits can paper over differences to some extent and provide a unified front, but the cost of doing so is massive. Not only do they need to be shipped down the wire (a constraint that mangles their design every step of the way), they introduce their own bugs, weirdisms, and inefficiencies.

In a functioning marketplace, the feature set of Ajax toolkits would be changing very quickly. As new versions of all browsers were released, the toolkits would keep code only so long as the old browsers whose functionality they bridged were in play. The existing features would be gone in a year or two, replaced by native APIs and tags, and new features would be added to help lessen the blow as the browser makers attempted to figure out who was going to "win" on a particular way of implementing this or that.

In any event, it should be obvious with even cursory introspection that the toolkits can only approach so much functionality at once (until they're baked in or preferentially cached by the browsers, that is), and that functionality can really only cover basic capabilities which are ubiquitous or are so close that a bit of script can push them over the edge. We may always have the toolkits, but arguments that somehow they'll dig us out of this hole don't stand up to inspection. We toolkit authors are combing the beach of features from the various browsers, desperately trying to construct a full set of shiny shells of any one type. Small irregularities we can polish out, but if one thing is square and the other is round, there's just no way we can make one into the other without spending more time, effort, and money than it's worth.
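To make the shell-polishing concrete, here's a minimal sketch of the kind of bridge code every toolkit of this era carries. The addEvent helper below is hypothetical (the name and signature are mine, not any particular library's API); it papers over the W3C addEventListener model versus IE's attachEvent:

```javascript
// Hypothetical cross-browser event helper: the name and shape are
// illustrative, not any particular toolkit's API. This is the sort of
// bridge code that must be shipped down the wire with every page until
// the browsers it papers over finally leave the deployed population.
function addEvent(node, name, handler) {
  if (node.addEventListener) {
    // W3C DOM event model (Firefox, Safari, Opera)
    node.addEventListener(name, handler, false);
  } else if (node.attachEvent) {
    // IE's proprietary model: note the "on" prefix, the global event
    // object, and the lost `this` binding, all of which need fixing up
    node.attachEvent("on" + name, function () {
      handler.call(node, window.event);
    });
  } else {
    // last-ditch fallback: clobber the DOM0 handler slot
    node["on" + name] = handler;
  }
}
```

Multiply this by event-object normalization, memory-leak workarounds, and keyboard and focus quirks, and you start to see the tax a toolkit pays, in bytes and in bugs, for every difference it polishes away.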
So now we've got the shape of a solution to the problem: toolkits can act as our canny observers of ubiquity, constantly combing for ways to get us to a robust baseline, iterating faster than the entire ecosystem might otherwise allow. But we've been over this beach a lot already. We're out of shiny shells that look similar, and have been for a long while. At this point we're reduced to selling our own compositions instead of looking for the natural beauty which attracted us to this beach in the first place. We need a good storm. We need new features to mine, and we need to be able to drop all the code which we're monkey-patching old versions of IE with.
To be bleedingly obvious about it, we need the browser vendors to ship. Not once, but over and over again, on a predictable schedule, and with some actual competition and risk taking. Hiding behind W3C (or even ECMA) committees instead of shipping new stuff is a tactic which the web community at large simply won't accept any more.
New browsers (aka, new features) are the only way we get new shells on this beach. Sure, some of them are ugly. They'll get left behind and, in time, washed back out to sea. So be it. If we get shiny, pretty new ones tomorrow, the ugly ones don't matter. But we have to be able to trust that we will. The idea that the web should be the same tomorrow as it was yesterday is ludicrous. "Don't break the web" simply cannot be code for "if it changes, that means it's broken". It would be laughable on its face if it were. It may have been changing imperceptibly of late, but it has always been changing. Likewise, things which make the web uninhabitable for large quantities of content can't be passed off as "good" just because they represent change.
What that all adds up to is that inciting browser makers to iterative action, spec or not, is the only way we actually get new stuff into the fabric of the web. If we're going to really ride this "open web" thing, we (the development community) need to come to grips with the fact that we're never going to get new features in one fell swoop. We've always gotten them in an N+2 fashion, and that's not going to change, no matter who's got the majority market share today.
What we should not ever settle for, ever again, is for any of the vendors we critically depend on to go radio silent. The web loses not when other platforms are better (they always have been, and always will be), but when we lose confidence that the market mechanism of competition is functioning. When we can't count on new features at predictable mile markers, there's no change to treat as opportunity. It doesn't even matter so much how big the changes are. We just need to know that they're coming and will keep coming in the future. When there's no change (aka: opportunity), the web dies a slow, wrenching death as cannier things which can provide the promise of a future that's not the same as the present steal mind share, first at the high end, and then all the way down the spectrum.
So, as I've been saying in my public talks of late, demand a public plan from the browser vendors. All of them. Asking for standards conformance isn't an answer, but combined with asking for a commitment to future versions it is the basis for resuscitating this market. We can start with: "When will your next beta or alpha be available? What about the version after that? Is your organization standardizing the new stuff you added in the last stable release? Where?"
Those questions don't speak to features, they speak to futures. Features without a future don't matter.