I tend to revisit the state of desktop GIS every so often. With the continued advancement of “web GIS,” the increased power of mobile platforms, the proliferation of spatial analysis techniques into non-traditional environments, the ubiquity of spatial databases, and a host of other factors, it’s tempting to speculate on the long-term prospects of traditional desktop GIS software. This seems especially true when the software in question originates in Redlands, California.
I was brought back to this topic by a recent discussion on Twitter, initiated by my friend, Atanas Entchev.
The ensuing discussion grew legs and continued much longer than I would have expected. It centered on confusion in Esri’s messaging or, more accurately, on subsequent interpretations of Esri’s messaging with regard to the status of ArcGIS Desktop. Long story short: much ado about nothing. Esri is releasing new versions of ArcGIS Pro and ArcMap. There are primary sources reaffirming their commitment to desktop GIS, so we can all go back to what we were doing. Awesome.
Where I work, we have developed a succinct philosophy to describe the challenges of collecting data, managing it, validating it, and preparing it for use: “Data is hard.”
This was brought to light in a very public manner by the vandalism displayed on basemaps produced by Mapbox. The responses by Mapbox and its CEO, Eric Gundersen, are good examples of how a company should respond to such incidents. Kudos to him and the team at Mapbox for addressing and rectifying the situation quickly.
Speculation quickly turned to vandalism of OSM, which is one of the primary data sources Mapbox uses in its products. That speculation was backed up by the edit history in the New York area, but it is interesting to note that the vandalism was caught early and never surfaced in OSM itself. In this case, the crowd worked as it was supposed to.
In March of 2017, I gave the keynote address at TUgis, Maryland’s geospatial conference. Despite a few requests, I’ve been remiss about posting the text until now. The following lengthy block quote is the address as I had written it. It differs slightly from what I actually said, but I can no longer recall the on-the-fly changes I made. The original text is pretty close, though. It was written before some of the announcements by Microsoft and Amazon later in the year, but that simply serves to illustrate the pace with which trends discussed here are moving.
There was a time in my consulting career where I was providing GIS software and database support to the US federal critical infrastructure protection community. Part of that work involved ‘event response,’ which most often took the form of natural disasters. I never deployed, but a lot of my co-workers did.
Ground truth was always the biggest problem. We were always trying to get a sense of what conditions were like on the ground with as little latency as we could manage. With the technology of the 2003–2007 time frame, that was a significant challenge. Whether with notepads, spreadsheets, or custom data-collection extensions deployed on Toughbooks, we tried just about everything we could think of. Some of my co-workers even managed a forward-deployed ArcIMS server to try to get anything useful out of the affected areas after Katrina.
Fast-forward to 2017 and we’re dealing with the unprecedented aftermath of Harvey in the Houston, Texas area. I find myself in the fortunate position of working for a company, Spatial Networks, that has a technology I wish we had back then. In addition to the technology, the company has the will to open it up and put it in the hands of whoever needs it.
It’s been almost two weeks since I returned from FOSS4G 2017 in Boston, Massachusetts (USA), and I wanted to take a little time to regroup and get caught up before settling down to write about it.
It was a busy week, highlighted for me and the Spatial Networks team by the first-ever Fulcrum Live event. Held on the second day of workshops, it was our first user conference for Fulcrum, the mobile data collection platform by Spatial Networks. The event went off without issue, so we are very happy about that.
I am following up my previous post with an extremely simple example using FME to kick off the refresh of a materialized view (matview) after a data import. I had never used FME prior to coming to Spatial Networks, but now I’m hooked. I’m having a really hard time finding things it can’t do.
As I mentioned in my last post, it’s really easy to refresh a matview in PostgreSQL using the REFRESH MATERIALIZED VIEW statement. This leaves open the possibility of automating the refresh as appropriate in an application or other process.
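As a minimal sketch, the refresh is a single statement (the matview name `md_towers` here is illustrative, not necessarily the one used in these posts). On PostgreSQL 9.4 and later, the `CONCURRENTLY` option lets the matview remain queryable during the refresh, provided the matview has a unique index:

```sql
-- Hypothetical matview name; rebuilds its contents from the base tables
REFRESH MATERIALIZED VIEW md_towers;

-- With a unique index on the matview, the refresh can run without
-- blocking concurrent reads (PostgreSQL 9.4+)
REFRESH MATERIALIZED VIEW CONCURRENTLY md_towers;
```

Either statement can be issued from an application, a cron job, or an FME SQLExecutor step after a data load completes.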
I decided to illustrate this using a basic FME example. Using the cellular tower data set from my previous post, I extracted a table containing only the records for the state of Maryland. The towers data set contains the two-letter abbreviation for the state, but not the full state name. So, I built a matview to join the state name to a subset of columns from the towers data set. The SQL for that matview is here:
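As a hedged sketch of what such a matview could look like (the table and column names `towers`, `states`, `state_abbrev`, `state_name`, and `md_towers` are illustrative assumptions, not the actual schema from the post):

```sql
-- Illustrative schema: join the full state name onto a subset of tower columns
CREATE MATERIALIZED VIEW md_towers AS
SELECT t.id,
       t.latitude,
       t.longitude,
       s.state_name,
       t.geom
FROM towers t
JOIN states s
  ON s.state_abbrev = t.state_abbrev
WHERE t.state_abbrev = 'MD';
```

The join happens once, at creation (and at each refresh), so queries against the matview avoid repeating it.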
I will use FME to append the records for the state of Virginia from a GeoJSON file to the PostGIS table containing the records for Maryland.