Infrequently Noted

Alex Russell on browsers, standards, and the process of progress.

Web 2.0 Expo NYC 2010

I've been in NY for the last 2 days, having a great time at the Web 2.0 Expo. Lots of folks have asked me for the slides and the lab zipfile, so there you are.

The browser panel was a lot of fun, thanks again to Ben and Dion's ability to get the right folks and ask the inconvenient questions.

I'm sad that I'll have to miss most of the conference, but it looks like HTML5/CSS3 has legs this year. Can't wait to see what happens.

"But IE 9 Is Just Around The Corner..."

The single most frustrating thing for me as a web developer is the incredible disconnect between day-to-day development and the shiny, shiny stuff showing up in HTML5 and modern browsers. It's made all the more frustrating by bold pronouncements from any (every?) vendor about how much more awesome the web will be thanks to the shiny stuff in their upcoming release. The reality for web developers is that those features won't matter on a relevant time-scale. Not your next project. Not your next 5 projects. No, the lag today between new features and when you can use them might as well be measured in geologic time.

How bad is it? The next time you read some tech journalist write about how some new browser version is just around the corner and how it'll make everything better, remember how slowly the installed base actually turns over.

The goal here isn't to drive you to drink, but to call out the dichotomy between when users get features and when developers can depend on them. For users, things tend to get better as soon as you pick up a new browser. Developers have to wait until every user makes that sort of choice. Said differently, users can benefit from the bleeding edge, but developers are beholden to the late adopters.

We should cut tech journalists and users a little slack, though; in a world without adoption friction, the launch of a new feature would translate directly into the sort of "now you can build sites and apps with awesome thing X" moment they describe. The instinct to believe that's how it should be is spot on. But that's observably not how the world works. Don't believe the hype. New browsers alone haven't fixed the problem so far, so what makes us think they will in the future? No vendor wants to talk about their last version, but for web developers stuck in the trenches, that's all there is to talk about. That's where the pain is, after all, and counting on browser upgrades to fix the problem quickly isn't working well enough. Chrome's aggressive auto-update feature is changing the way things will work in the future, but for now we're still stuck in the slow-upgrade dynamic.

We need a Plan B.

Chrome Frame Now Stable!

Exactly a year from the original announcement, we've just launched a stable, ready-for-prime-time version of Chrome Frame (with MSI packages). In addition to heroic work by the whole team on improving GCF stability in the face of poorly written extensions, the biggest change in the Stable release is an astounding improvement in cold start performance (greater than 3x faster in many cases). Faster startups mean that your users wait less to experience a better web and that GCF is appropriate for a wide range of applications that can benefit from HTML5 features. Faster starts also mean that JavaScript heavy sites benefit even more from V8 since the engine can start running your code faster, sooner.

Like Chrome, GCF is on the 6 week release cycle, so expect features like per-user install and even faster starts that are arriving in the Dev Channel to become available quickly.

So what are you waiting for? Say no to legacy baggage and start saying yes to HTML5 and the future by adding the GCF header/tag to your sites and apps. It's nearly zero effort. When you're comfortable, you can add a prompt and free yourself to build sites and apps against only the modern web. When you do, you'll see how fast and productive web development can really be. I promise you won't want to go back. Thanks to GCF, you won't have to.
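For reference, the opt-in described above is the X-UA-Compatible marker; a minimal sketch of the tag form (the same chrome=1 value can also be sent as an X-UA-Compatible HTTP response header):

```html
<!-- Ask Chrome Frame to render this page when the plugin is installed -->
<meta http-equiv="X-UA-Compatible" content="chrome=1">
```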

Wait, What?

People are writing open letters to me? Weird.

I will answer the question, though: "You started the collaboration of competing toolkits that led to the creation of Dojo. How did you do it!?!?"

We did it by pointing out to folks who were working on their own things that the personal effort radius exists (i.e., we'd each pioneered some aspect of the complete toolkit but could never drive it to completion on our own) and that we can get further together than apart. From there, experience took over. This is a notable contrast to what happened after and in different circles where developers maybe didn't have as many KLOC of JS under their belts. With fewer sleepless nights of debugging, optimizing, and porting it's harder to see what's ahead and apply the discipline necessary to avoid it. Part of that accommodation for the future is compromise. And why would anyone want to compromise and work with other people when they don't feel like they really need to? It's fun being the smartest person in the room, and if you don't let other people in as equals, you always are. Swallowing that pride is the big transition. Isn't it always?

I guess what it comes down to is that I didn't try to organize people who didn't see the value in organization: instead, I tried to organize folks whose experience was valuable in terms of personal maturity and not just facility with code. We picked a hard technical problem and an easier social problem knowing that the social aspects were more critical. Did we succeed? Yes, but only in the ways we set out to succeed. Dojo continues and is outstanding for the problems it was designed to solve, but the JS market has strong winner-take-all dynamics and the high-end toolkits like Dojo all compete for a highly-knowledgeable, experienced set of developers who understand not only the problem they have today but also the problems they're going to have in a month or two. We built Dojo with folks like that, for folks like that. I should have known then what I see clearly now, though: the fact that there are relatively few experienced, disciplined developers in the world means that when you build things for them, relatively few people will understand what you've done.

Welcome to the club, Justin.

JavaScript UXO Removal Updated

JavaScript is a lovable language. Real closures, first-class functions, incredible dynamism: it's a joy when you know it well.

Less experienced JS programmers often feel as though they're waltzing in a minefield, though. At many steps along the path to JS enlightenment everything feels like it's breaking down around you. The lack of block lexical scope sends you on pointless errands, the various OO patterns give you fits as you try to do anything but what's in the examples, and before you know it even the trusty "dot" operator starts looking suspect. What do you mean that this doesn't point to the object I got the function from?

Repairs for some of the others are on the way in ES6 so I want to focus on the badly botched situation regarding "promiscuous this", in particular how ES5 has done us few favors and why we're slated to continue the cavalcade of failure should parts of the language sprout auto-binding.

Here's the problem in 5 lines:

var obj = {
  _counter: 0,
  inc: function() { return ++this._counter; }
};
var inc = obj.inc;

See the issue? obj.inc results in a reference to the inc method without any handle or reference to its original context (obj). This is asymmetric with the behavior we see when we call methods directly, since in that case the dot operator populates the ThisBinding. We can see it clearly when we assign to intermediate variables:

var _counter = 0; // global "_counter", we'll see why later
var inc = obj.inc;; // 1; // 2
inc(); // 1

Reams have been written on the topic, and ES5's belated and weak answer is to directly transcribe what JS libraries have been doing by providing a bind() method that returns a new function object that carries the correct ThisBinding. Notably, you can't un-bind a bound function object, nor can you treat a bound function as equal to its unbound ancestor. This, then, is just an API formalism around the pattern of using closures to carry the ThisBinding object around:

var bind = function(obj, name) {
  return function() {
    return obj[name].apply(obj, arguments);
  };
};

// Event handling now looks like:
//   node.addEventListener("click", bind(obj, "inc"));

var inc = bind(obj, "inc");; // 1; // 2
inc(); // 3

inc === obj.inc; // false

ES5's syntax is little better but it is built-in and can potentially perform much better:

var inc = obj.inc.bind(obj);
// In a handler:
node.addEventListener("click", obj.inc.bind(obj));
Syntax aside, we didn't actually solve the big problems since unbound functions can still exist, meaning we still have to explain to developers that they need to think of the dot operator doing different things based on what characters happen to come after the thing on the right-hand side of the dot. Worse, when you get a function it can either be strongly-bound (i.e., it breaks the .call(otherThis, ...) convention) or unbound -- potentially executing in the "wrong" ThisBinding. And there's no way to tell which is which.

So what would be better?

It occurs to me that what we need isn't automatic binding for some methods, syntax for easier binding, or even automatic binding for all methods. No, what we really want is weak binding; the ability to retrieve a function object through the dot operator and have it do the right thing until you say otherwise.

We can think of weak binding as adding an annotation about the source object to a reference. Each de-reference via [[Get]] creates a new weak binding which is then used when a function is called. This has the side effect of describing current [[Get]] behavior when calling methods (since the de-reference would carry the binding and execution can be described separately). As a bonus, this gives us the re-bindability that JS seems to imply should be possible thanks to the .call(otherThis) contract:

var o = {
  log: function() { console.log(this.msg); },
  msg: "hello, world!"
};

var o2 = { msg: "howdy, pardner!" };

o.log(); // "hello, world!"
o2.log = o.log;
// calling log through o2 replaces the weak binding:
o2.log(); // "howdy, pardner!"

But won't this break the entire interwebs!?!?

Maybe not. Hear me out.

We've already seen our pathological case in earlier examples. Here's the node listener use-case again, this time showing us exactly what context is being used for unbound methods:

document.body.addEventListener("click", function(evt) {
  console.log(this == document.body); // true in Chrome and FF today
}, true);

We can think of dispatch of the event calling the anonymous function with explicit ThisBinding, using something like, evt) as the call signature for each registered handler in the capture phase. Now, it's pretty clear that this is whack. DOM dispatch changing the ThisBinding of passed listeners is an incredibly strange side-effect and means that even if we add weak binding, this context doesn't change. At this point, though, we can clearly talk about the DOM API bug in the context of sane, consistent language behavior. The fact that event listeners won't preserve weak binding and will continue to require something like this is an issue that can be wrestled down in one working group:

       (function(evt) { ... }).bind(otherThis),

The only case I can think of when weak bindings will change program semantics is when unbound method calls in the global object do work on this in a way that is intentional. We have this contrived example from before too, but as you can see, it sure looks like a bug, no?

var _counter = 0; // a.k.a.: "this._counter", a.k.a.: "window._counter"
var obj = {
  _counter: 0,
  inc: function() { return ++this._counter; }
};
var inc = obj.inc;; // 1; // 2
console.log(obj._counter, this._counter); // 2, 0
inc(); // 1
inc(); // 2
console.log(obj._counter, this._counter); // 2, 2

If this turns out to be a problem in real code, we can just hide weak bindings behind some use directive.

Weak binding now gives us a middle ground: functions that are passed to non-pathological callback systems "do the right thing", most functions that would otherwise need to have been bound explicitly can Just Work (and can be rebound to boot), and the wonky [[Get]] vs. [[Call]] behavior of the dot operator is resolved in a tidy way. One more bit of unexploded ordnance removed.

So the question now is: why won't this work? TC39 members, what's to keep us from doing this in ES6?

Update: Mark Miller flags what looks to be a critical flaw:

var obj = {
  callbacks: [],
  register: function(func) { this.callbacks.push(func); },
  fire: function() {
    for (var i = 0; i < this.callbacks.length; i++) {
      this.callbacks[i](); // the de-reference re-binds this to the callbacks array!
    }
  }
};
obj.register(o.log); // Does the wrong thing!

The problem here is our call into each of the callback functions, which still execute in the scope of the wrong object. This means that legacy code still does what it always did, but that's just as broken as it was. We'd still need new syntax to make things safe. Ugh.

Older Posts

Newer Posts