Conference Wrap Up

I tend not to take a lot of pictures at conferences. Or blog much when I’m at them. In fact, it’s usually not discernible that I’m at a conference except that I’m not doing all the other stuff I normally do. I’m not sure why. I always bring a camera (sometimes more than one); I just don’t use it.

That’s a long way of saying that FOSDEM and ETech were wonderful, but I don’t have a lot of photographic evidence to support that assertion or even to prove that I was really there. It might be for the best, though. My ETech demo crashed and burned thanks to my foolish assumption that we’d have a working network. Lesson learned.

As this was my first ETech, I was (and still am) in awe of the people I met, and amazed that they would actually talk to me when approached! Whether or not they’ll talk to me after having met me is another question entirely. At ETech I got to meet some users of Dojo and folks who are implementing Comet apps small and large. It drove home my suspicion that we do need a new name; one that will let us discuss old concepts with people who may not yet be familiar with them. The sheer number of “and our product has been doing that since X” replies to one of my previous posts also makes the point better than I ever could. Comet is useful, the solutions to the technical problems are becoming more widely distributed, and a set of patterns for how and when to use Comet will soon emerge.

On that front, Douglas Crockford’s new JSONRequest proposal was announced. Maddeningly, there’s still no link to the various ongoing conversations about it from the document itself, but it does seem worthy of discussion. The provision for duplex communication is particularly interesting, as it would provide the first known way to do cross-domain Comet without resorting to Flash. James Burke, Dojo’s newest committer, has great comments on the proposal.
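To make the duplex idea concrete, here’s a rough sketch of what a long-poll Comet loop might look like under the proposal. It assumes the post(url, send, done) shape described in Crockford’s draft; the endpoint URL, the message format, and the handleEvents hook are all made up for illustration, and since JSONRequest is only a proposal, none of this actually runs in a browser today.

```javascript
// Sketch only: JSONRequest is a proposal, not a shipping browser API.
// Assumed interface, per the draft:
//   JSONRequest.post(url, send, done) -> request number
//   done(requestNumber, value, exception)
// The URL, message shape, and handleEvents hook are hypothetical.

var EVENT_URL = "http://example.com/events"; // hypothetical Comet endpoint

function listen(lastEventId) {
    // Ask the server for anything newer than what we've already seen.
    // Under the proposal the server may hold the request open until it
    // has something to say, which is what would make long-poll Comet work.
    JSONRequest.post(EVENT_URL, { since: lastEventId },
        function (requestNumber, value, exception) {
            if (exception) {
                // Network trouble or a malformed response; back off and retry.
                setTimeout(function () { listen(lastEventId); }, 5000);
                return;
            }
            // value arrives as parsed JSON rather than an eval'd string.
            handleEvents(value.events);   // hypothetical application hook
            listen(value.lastEventId);    // immediately reopen the request
        });
}

function handleEvents(events) {
    for (var i = 0; i < events.length; i += 1) {
        // Application-specific handling goes here.
    }
}

listen(0);
```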

Something I hope to come back to soon is a point Bruce Sterling made during his keynote at ETech: that augmenting human intelligence is a better goal than replacing it, and that language has been our stumbling block in describing what we should be working toward. My recent introduction by Brad to Doug Engelbart’s seminal work, and the realization that Jot is little more than a conceptual descendant of Augment delivered via the web, has made these words ring in my head that much more acutely.

2 Comments

  1. nolan
     Posted March 16, 2006 at 7:06 pm | Permalink

    :-/ This JSON thing just doesn’t sound all that good to me. A stripped-down, data-only JavaScript? What’s wrong with XML? It can be streamed both ways. XMPP/Jabber has proven that. Anonymous SASL also recently got defined as a Jabber Enhancement Proposal.

    You wouldn’t need to go the full XMPP route though. A simple XML stream with an onStanza method would work just fine.

    Though the benefits of full XMPP would be very good: pre-existing servers and libraries, extensive documentation, a pre-existing open standard, server-to-server communication, a couple of publish/subscribe protocols (MUC and pub/sub), and the kitchen sink.

    Why define something new?

  2. grumpY!
    Posted March 17, 2006 at 10:41 am | Permalink

    nolan – the essential gain of json is the rapid integration of data into your code. yes it’s the same data as would appear in an xml serialization, just integrated as a native data type.

    that said, i continue to believe the development community underestimates the exploitability of javascript. you have to wonder if eval’ing a string of javascript of potentially dubious origin in your client-side code is a smart thing to do. yes i know the browser domain limitations are designed to keep this data from being “dubious”, but there are well known ways around this. i suspect we will see some amusing XSS hacks via json-style data exchanges at some point.