When I was in college, I had a psychology professor who posited that you could train a cat (a dodgy proposition at best) to take a circuitous route to its food bowl by only rewarding that behavior. He was clearly a behaviorist and was convinced that you could completely condition the instinct to go straight to the food bowl out of the cat. To my knowledge, this professor did not own a cat and never attempted to test his assertion.
I was reminded of this after reading my friend Atanas Entchev’s post in reaction to the PostGIS Day hangout panel discussion. In his post, Atanas describes the difficulty of convincing customers to consider open-source geospatial tools. These customers and prospects are comfortable with their proprietary tools and associated workflows and are reluctant to consider switching. I have encountered this attitude many times myself, so I take no issue with the observation. Barriers to exit are real, regardless of the new technology under consideration. Organizations align themselves around their tools to achieve maximum efficiency with them. I discussed these issues in a talk I gave last year to the New Jersey Geospatial Forum about how organizations can extend their existing geospatial technology investments with open-source technologies. These issues are very real for any organization with a mature, extended investment in a particular technology stack.
Atanas went on to liken the attitude to the one with which some people view alternative medicine, and I can see his point. Traditional GIS has set itself apart from the rest of the technology world for so long that users are generally conditioned to believe that GIS workflows should involve a series of Rube Goldberg machinations: file-based data sets, some proprietary scripting, and possibly some application-level business logic to relate and/or join data as necessary. This has taken various forms over the years, but diagrams of those workflows tend to look the same.
Standing in contrast to such things, PostGIS looks alien, or “alternative.” In truth, it is not “alternative” but rather “standard.” As an example, here is a map I produced a few weeks ago showing the average ages of bridges by county. (I am not a cartographer.) It is a simple aggregation of the National Bridge Inventory, rolling tens of thousands of records up to the county level (3,100-ish rows). All of the data processing was done in PostgreSQL/PostGIS using nothing more exotic than SQL aggregate functions and some joins. None of the operations took longer than six seconds on my very pedestrian laptop. When I was done, I used QGIS to play with visualization and then dump out the static GeoJSON for use in Leaflet.
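For readers who haven’t seen this pattern before, the processing amounts to little more than a join and a GROUP BY. The sketch below is a minimal illustration under assumed names, not my actual script; the table and column names (nbi, counties, year_built, county_fips, fips, geom) are hypothetical stand-ins for the real schema.

```sql
-- A rough sketch of the aggregation described above. Table and column
-- names are hypothetical stand-ins, not the actual NBI schema.
CREATE TABLE bridge_age_by_county AS
SELECT c.fips,
       c.name,
       c.geom,
       count(*) AS bridge_count,
       -- average age in years, derived from the recorded year built
       round(avg(extract(year FROM now()) - b.year_built)) AS avg_age
FROM counties c
JOIN nbi b
  ON b.county_fips = c.fips   -- plain attribute join on the FIPS code
GROUP BY c.fips, c.name, c.geom;
```

A spatial variant could join on geometry with ST_Contains instead, but when the FIPS codes are clean, the plain attribute join is all you need.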
For my many friends who are regular users of PostGIS, this is nothing exotic. For some of my friends who regularly use commercial tools, this is interesting but not earth-shattering. But for a large portion of my friends who are comfortable with traditional tools and workflows, the time-to-market for this effort (35 minutes from the time I downloaded the NBI to the time I pushed the map to GitHub) has them taking notice. This entire workflow involved SQL extended with OGC-compliant spatial objects. (Side note: I have been hard on OGC’s web efforts, but the Simple Features Specification has been a quiet workhorse across the geospatial industry for over a decade. It’s a good example of the benefit that well-designed standards can provide.) The map is being served as static content over simple HTTP, with some client-side JavaScript handling visualization. No heavy APIs or middleware involved or needed. The QGIS part was really necessitated by my own cartographic limitations; I could have fully generated the GeoJSON from SQL as well, as the sketch below shows.
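Something along these lines would do it. This is a hedged sketch, assuming the hypothetical bridge_age_by_county table from the earlier example and a PostgreSQL version with json_build_object and json_agg (9.4 or later):

```sql
-- Produce a complete GeoJSON FeatureCollection straight from SQL.
-- Assumes the hypothetical bridge_age_by_county table sketched above.
SELECT json_build_object(
         'type',     'FeatureCollection',
         'features', json_agg(
           json_build_object(
             'type',       'Feature',
             'geometry',   ST_AsGeoJSON(geom)::json,
             'properties', json_build_object(
               'fips',         fips,
               'name',         name,
               'bridge_count', bridge_count,
               'avg_age',      avg_age
             )
           )
         )
       ) AS geojson
FROM bridge_age_by_county;
```

The output of that one query is a file Leaflet can consume directly; no desktop tool sits in the middle at all.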
This example is fairly simplistic, but I have good friends who are using PostGIS, and nothing more, to perform analyses and produce results for decision makers while sitting in meetings. This type of turnaround is standard in other market segments, and the geospatial industry should expect nothing less. It requires nothing more than a strong foundation in SQL, mastery of spatial processes, and a detailed understanding of your own data.
So I have come to realize that the mainstream GIS community has become very much like my professor’s theoretical cat: conditioned to take the long way to the end result when more direct paths are clearly available. What’s more, it has become conditioned to think of such approaches as normal. Geospatial analytical operations can be very complex, and the approaches to performing them were, in the past, necessarily convoluted due to the lack of support for spatial data types and operations in mainstream platforms. Those barriers have been rapidly disappearing over the past decade or so, but the user community has been slow to let go of its comfort with familiar tools and convoluted approaches. As I stated above, organizational barriers to exit are real considerations, but the inherent efficiencies available in modern geospatial tools such as PostGIS make the transition worth the effort.
I love open source, but most of my clients and prospects imagine open source to be like this open source car (http://blog.entchev.com/2010/04/29/open-source-car.aspx). They would rather buy a nice new Chevy.
Some people, like me, enjoy the satisfaction of baking their own cake. Others just want to buy a cake and eat it. The problem is, employee satisfaction doesn’t stack up well against speed of delivery when there are wads of cash available to spend on vendor tools. So, OS has to be more efficient in the long run. In some areas it is (e.g., 64-bit PostGIS); in other areas it isn’t (e.g., publishing web services from the desktop).
Someone has wads of cash? I’m a consultant, do tell! 😉
Unlike your professor’s cat, the proprietary software vendor starts their conditioning as early as elementary school. You will be assimilated.
I tried to stay away from making this an Esri discussion. Although their tools are at the core of some of the most convoluted workflows I’ve seen, I know there’s good work going on to produce more modern tools. I think they’ve been hemmed in by some of their own legacy tools as well. Ultimately, users have the power to avail themselves of better tools and methods. They just need to decide to do so.
I get that too. My views are tempered by the fact that my whole career (pretty much) has been working for, or with, local & regional government. They are terrified of making a wrong choice about their GIS, so they go the safe route. Plus, they argue that they can save training money because new hires come out of school with minimal training needs. The opportunity lies with GIS folks seeing value in other tools, being able to sell them to finance/IT/electeds/etc., and beginning to work them in around the edges. I’ve seen it done.
Also, I’m not sure I said ‘tools’ often enough in my previous reply. 😉