DDJ on “The Return of the Desktop”

Okay, I swear I’m not on the DDJ payroll, but this article caught my eye immediately. Michael Swaine has been on a roll lately, but I think this one just drips with significance for the GIS community.

Over the past 10 years, as everyone has run screaming from the desktop, I’ve been a little mystified as to why it was considered a good thing to reduce a CPU more powerful than everything NASA had in 1969 to a mere vehicle for a browser. The browser-based model reduced our computers to really cool-looking equivalents of a VT220, so it’s nice to see that the market is starting to gain back a little sanity.

I will readily admit that the browser model has its advantages. Application deployment is a snap compared to what it takes to keep desktops in sync. Anyone who’s had to deal with NMCI will vouch for that. In addition, there’s the matter of targeting the desktop OS. That can be a pain for a desktop app (Windows, UNIX, or Linux? Which flavor of Linux? 32-bit or 64-bit? Ugh.), and that hasn’t changed much over the years. I remember running ArcView under Win32s on Windows 3.11 and testing InstallShield builds against WinNT 4, Win2K, and Win9x for MapObjects apps. So, yeah, the web app model is definitely attractive. However, the trend has also led to the need for really big bandwidth and multi-socket/multi-core servers, and to a loss of control at the desktop for the user. If the developer of the server app/service didn’t think of it and the sysadmin of the server doesn’t want you to have it, you’re kind of outta luck. Also, Moore’s Law has been giving us faster, better CPUs, but we’ve been asking them to do less and less.

The article gives a few examples of products that are making use of local resources instead of merely relying on the server for everything, discussing things like Dojo, Gears, and Silverlight. We’re already seeing some of this trend in our market with the advent of the various virtual globes (Google Earth, NASA World Wind, ArcGIS Explorer, etc.). These apps tap into very data-rich servers or services but use local resources for tasks such as tile caching. I think GIS is an ideal place to push the boundaries of this model due to the resource-intensive nature of some geospatial processes.
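
To put the tile-caching idea in concrete terms, here’s a minimal sketch of the pattern in Python. The URL template, cache directory, and z/x/y scheme are made up for illustration and aren’t taken from any of the globes above; the point is simply that the client checks local disk before asking the server for anything.

```python
import os
import urllib.request

# Hypothetical tile service and cache location -- placeholders, not a real endpoint.
TILE_URL = "http://tiles.example.com/{z}/{x}/{y}.png"
CACHE_DIR = os.path.expanduser("~/.tile_cache")

def get_tile(z, x, y):
    """Return tile bytes, preferring the local cache over the network."""
    path = os.path.join(CACHE_DIR, str(z), str(x), "%d.png" % y)
    if os.path.exists(path):
        # Cache hit: no round trip to the server, no bandwidth spent.
        with open(path, "rb") as f:
            return f.read()
    # Cache miss: fetch once from the server, then keep it locally for next time.
    data = urllib.request.urlopen(TILE_URL.format(z=z, x=x, y=y)).read()
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)
    return data
```

Nothing fancy, but it’s exactly the kind of work that makes sense to push down to that otherwise idle desktop CPU and disk.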

This is an area where, with a little work, the ESRI product line could shine. With ArcGIS Desktop, Engine, and Server, the same objects can potentially reside on the server as well as the desktop. It would be interesting to see these objects communicate in such a way as to distribute the processing load between themselves. Of course, any of the technologies mentioned in the article could serve as a basis for doing something similar with non-ESRI technologies, or for users with only a browser. Not that such an approach would be easy or trivial, but it would certainly be worthwhile.
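
To make that a little less abstract, here’s a rough Python sketch of what splitting work between the desktop and a server might look like. LocalEngine, RemoteService, and the size threshold are invented for illustration; none of this corresponds to the actual ArcObjects or ArcGIS Server APIs.

```python
# Hypothetical load-splitting sketch -- the classes and threshold below are
# invented for illustration and are not part of any ESRI API.

SIZE_THRESHOLD_MB = 50  # arbitrary cutoff for "small enough to run locally"

class LocalEngine:
    """Stands in for geoprocessing objects running on the desktop."""
    def buffer(self, features, distance):
        print("Buffering %d features locally" % len(features))
        return ["local result"]  # placeholder for the real output

class RemoteService:
    """Stands in for the same operation exposed by a server."""
    def buffer(self, features, distance):
        print("Dispatching buffer of %d features to the server" % len(features))
        return ["server result"]  # placeholder for the real output

def buffer_features(features, distance, size_mb,
                    local=LocalEngine(), remote=RemoteService()):
    """Run the job on the desktop when the input is small, otherwise on the server."""
    worker = local if size_mb <= SIZE_THRESHOLD_MB else remote
    return worker.buffer(features, distance)

# Example: a small job stays on the desktop, a big one goes to the server.
buffer_features(["parcel_1", "parcel_2"], 100, size_mb=5)
buffer_features(["statewide_roads"], 100, size_mb=900)
```

The real decision logic would obviously be more involved, but the basic idea of choosing where the work runs based on what the desktop can handle is the part that interests me.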