Adam Bosworth has a blog post entitled AJAX reconsidered that hits right at the heart of some questions I've been asking myself about the renewed interest in using DHTML and server callbacks via XMLHttpRequest to build website applications. Adam writes:
I've been thinking about why Ajax is taking off these days and creating great excitement when, at the time we originally built it in 1997 (DHTML) and 1997 (the XML over HTTP control) it had almost no take up. ...First, the applications that are taking off today in Ajax aren't customer support applications per se. They are more personal applications like mail or maps or schedules which often are used daily. Also people are a lot more familiar with the web and so slowly extending the idiom for things like expand/collapse is a lot less threatening than it was then. Google Maps for example uses panning to move around the map and people seem to love it.
Secondly, the physics didn't work in 1997. A lot of Ajax applications have a lot of script (often 10 or 20,000 lines) and without broadband, the download of this can be extremely painful. With broadband and standard tricks for compressing the script, it is a breeze. Even if you could download this much script in 1997, it ran too slowly. Javascript wasn't fast enough to respond in real time to user actions let alone to fetch some related data over HTTP. But Moore's law has come to its rescue and what was sluggish in 1997 is often lightning quick today.
Finally, in 1997 or even in 1999 there wasn't a practical way to write these applications to run on all browsers. Today, with work, this is doable. It would be nice if the same code ran identically on Firefox, IE, Opera, and Safari, and in fact, even when it does, it doesn't run optimally on all of them requiring some custom coding for each one. This isn't ideal, but it is manageable.
I find the last point particularly interesting. If Web browsers such as Firefox had not cloned Microsoft's proprietary APIs in a way that made it easy to write what were formerly IE-specific applications in a cross-browser manner, then AJAX wouldn't be the hip buzzword du jour. This brings me to Microsoft's next generation of technologies for building rich internet applications: Avalon and XAML.
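To make the cloning point concrete, the canonical cross-browser trick of the day is feature detection: Firefox, Opera, and Safari expose a native XMLHttpRequest constructor that mimics the interface of Microsoft's original ActiveX control, so a single function can paper over the difference. A minimal sketch (the function name `createXHR` is my own, not from any particular library):

```javascript
// Cross-browser XMLHttpRequest creation, circa 2005.
// Mozilla-family browsers (and later IE 7) provide a native
// XMLHttpRequest object cloned from the interface of Microsoft's
// ActiveX control; IE 5/6 only provide the ActiveX version.
function createXHR() {
  if (typeof XMLHttpRequest !== "undefined") {
    // Firefox, Opera, Safari: native constructor
    return new XMLHttpRequest();
  } else if (typeof ActiveXObject !== "undefined") {
    // IE 5/6: the original "XML over HTTP" control
    return new ActiveXObject("Microsoft.XMLHTTP");
  }
  throw new Error("XMLHttpRequest is not supported in this browser");
}
```

Because the cloned object answers to the same methods (`open`, `send`, `onreadystatechange`, and so on), everything after this one branch can be written once, which is precisely what made AJAX practical across browsers.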
A few months ago, C|Net ran an article entitled Will AJAX help Google Clean Up? In the article, the following statement was attributed to a Microsoft representative:
"It's a little depressing that developers are just now wrapping their heads around these things we shipped in the late 20th century," said Charles Fitzgerald, Microsoft's general manager for platform technologies. "But XAML is in a whole other class. This other stuff is very kludgy, very hard to debug. We've seen some pretty impressive hacks, but if you look at what XAML starts to solve, it's a major, major step up."
Based on how adoption of DHTML/AJAX occurred over the past few years, I suspect that Avalon/XAML will follow a similar path since the initial conditions are similar. If I am correct, then even if Avalon/XAML is a superior technology to DHTML/AJAX (which I believe to be the case) it will likely be shunned on the Web due to lack of cross-browser interoperability but may flourish within homogeneous intranets. This shunning will continue until suitable clones of Avalon/XAML's functionality appear for other browsers, at which point, as soon as some highly visible pragmatist adopts the technology, it will become acceptable. However, it is unclear to me that cloning Avalon/XAML is really feasible, especially if the technology evolves at a regular pace as opposed to being left to languish as Microsoft's DHTML/AJAX technologies have been. This would mean that Avalon/XAML would primarily be an intranet technology used by internal business applications and some early adopter websites, as was the case with DHTML/AJAX. The $64,000 question for me is whether this is a desirable state of affairs for Microsoft and, if not, what should be done differently to prevent it?
Of course, this is all idle speculation on my part as I procrastinate instead of working on a whitepaper for work.
Disclaimer: The aforementioned statements are my opinions and do not represent the intentions, thoughts, plans or strategies of my employer.