Recently, I’ve gotten back in touch with .Net in the form of .Net Core. I’ve been shaking off some of the coding rust and building some tools to help with data handling related to the Foresight data service at Spatial Networks. It’s been fun to get my hands dirty again and also interesting to see how .Net has evolved over the past few years.
It’s been a few years since I’ve done a lot with .Net and, after spending some time in the Node ecosystem, this was my first foray into .Net Core. The application I was working on just wasn’t coming together correctly in Node, so I started prototyping out the logic flow in .Net Core, with the intent to port it back to Node when I had a good reference implementation. The more I kept using .Net, the more impressed I got, so I just kept the application there.
James Fee and I released the next episode of our podcast this week. This month, we are taking a closer look at PostGIS and how you can get started with it. We’re both longtime users and huge fans of PostGIS, so it was fun to dig into it a little.
You can check it out here, on Google Play, iTunes, Spotify, or wherever you listen to podcasts.
It seems that I tend to revisit the state of desktop GIS every so often. With the continued advancement of “web GIS,” as well as the increased power of mobile platforms, proliferation of spatial analysis techniques into non-traditional environments, the ubiquity of spatial databases, and a host of other factors, it’s tempting to speculate on the long-term prospects of traditional desktop GIS software. This seems especially true when the software in question originates in Redlands, California.
I was brought back to this topic by a recent discussion on Twitter, initiated by my friend, Atanas Entchev.
The ensuing discussion grew legs and continued much longer than I would have thought. The core of the discussion centered around confusion in Esri’s messaging or, more accurately, subsequent interpretation of Esri’s messaging with regard to the status of ArcGIS Desktop. Long story short: much ado about nothing. Esri is releasing new versions of ArcGIS Pro and ArcMap. There are primary sources reaffirming their commitment to desktop GIS, so we can all go back to what we were doing. Awesome.
Where I work, we have developed a nuanced philosophy to describe the niceties of collecting data, managing it, validating it, and preparing it for use: “Data is hard.”
This was brought to light in a very public manner by the vandalism that was displayed on basemaps produced by Mapbox. The responses by Mapbox and their CEO, Eric Gundersen, are good examples of how a company should respond to such incidents. Kudos to him and the team at Mapbox for addressing and rectifying the situation quickly.
Speculation quickly ran to vandalism of OSM, which is one of the primary data sources used by Mapbox in their products. That speculation was backed up by the edit history in the New York area, but it is interesting to note that the vandalism was caught early in OSM and never came to light in OSM itself. In this case, the crowd worked as it was supposed to.
In March of 2017, I gave the keynote address at TUgis, Maryland’s geospatial conference. Despite a few requests, I’ve been remiss about posting the text until now. The following lengthy block quote is the address as I had written it. It differs slightly from what I actually said, but I can no longer recall the on-the-fly changes I made. The original text is pretty close, though. It was written before some of the announcements by Microsoft and Amazon later in the year, but that simply serves to illustrate the pace with which trends discussed here are moving.
There was a time in my consulting career where I was providing GIS software and database support to the US federal critical infrastructure protection community. Part of that work involved ‘event response,’ which most often took the form of natural disasters. I never deployed, but a lot of my co-workers did.
Ground truth was always the biggest problem. We were always trying to get a sense of what conditions were like on the ground with as little latency as we could manage. With the technology of the 2003 – 2007 time frame, that was a significant challenge. Whether notepads or spreadsheets or custom data collection extensions deployed on ToughBooks, we tried just about everything we could think of. Some of my co-workers even managed a forward-deployed ArcIMS server to try to get anything useful out of the affected areas after Katrina.
Fast-forward to 2017 and we’re dealing with the unprecedented aftermath of Harvey in the Houston, Texas area. I find myself in the fortunate position of working for a company, Spatial Networks, that has a technology I wish we had back then. In addition to the technology, the company has the will to open it up and put it in the hands of whoever needs it.