geoMusings

geospatial technologies and practices

Lock-In

I’ve been a consultant/programmer/integrator/other for over twenty years now. That’s not quite long enough to say I’ve seen it all, but long enough to notice a few patterns. Admittedly, I’ve spent the vast majority of that time working in the defense world, so the patterns may be heavily skewed toward it, but I think not.

I’ve run across a number of well-entrenched government-developed systems, such as command-and-control systems, with user interfaces and experiences that many professional designers would consider abhorrent. Yet, I have watched smart, motivated men and women in uniform stand in front of working groups and committees dedicated to “improving” systems and workflows and advocate passionately for these seemingly clunky systems.

Why? Because they know how to use these systems inside and out to meet their missions. User experience is ultimately about comfort and confidence. A user who is comfortable with a system will have a great experience with it regardless of its appearance. DOD tackles this reality through training. For all its faults, there is still no organization better at developing procedures and thoroughly training its people in them. The result is a passionate loyalty to the tools that help them do their jobs, which places a very high hurdle in front of any tools that seek to replace current ones.

The Esri UC So Far #EsriUC

So I’m halfway through the largest geospatial event of the year, attending it for the first time in four years, and I haven’t blogged yet. As always, it’s a busy week. Because this event draws people from all over the country (and world), my dance card fills up pretty quickly. And, by the way, there’s a conference going on.

This is the first time I’ve attended the Esri User Conference as just an attendee. If it were a video game, I’d be playing it on the easy level. I sat through the entire plenary for the first time in years. It was a nice table setting for the rest of the week. As the father of a dancer, I have developed an eye for choreography, and there is plenty of it up on the plenary stage. If I were to level one piece of constructive criticism at the UC, it’s that I’d let the speakers be themselves a little bit more. That said, the content was delivered smoothly, which is really the larger point.

Gearing Up for the Esri UC

With a house move behind us and a lot of unpacking and other tasks ahead, I am nonetheless getting ready to head out to the Esri International User Conference next week. This will be my first time attending since 2010 and is the first UC since then that has aligned with my schedule in a way that I can make it. Of course, the price is right this year as well ($0.00).

The “big” UC has steadily dropped in significance for me over the years as it has become much easier to get Esri-related information through other channels, primarily social media and local/regional events such as the FedUC and local dev meetups. The last few trips to San Diego left me feeling that the content presented there was getting increasingly superficial compared to those other events. This year, however, I have been in the midst of building a house and moving, so I have not had time to attend the smaller events. As a result, the UC makes sense. I am hoping the recent trend reverses itself.

I still do some Esri-based consulting so it’s important to stay current however I can. My government customers are starting to at least ask about ArcGIS Online, so I want to finally get my mind around it as best I can. The messaging around that platform has been so muddled that it’s still difficult for me to articulate what productivity advantages, if any, it actually offers. My own experimentation with it has left me wanting. The fact that it has been certified as FISMA compliant will certainly raise its profile with some of my Federal customers, though that’s typically only the first step in a very long process. I’m also curious about ArcGIS Pro.

Unlike previous years, I won’t be manning a booth (we sold ours a few years ago) or directly representing a customer, so I’ll actually have the luxury of attending sessions and meeting up with people. There are some people whom, unfortunately, I really only see face-to-face at such large magnet events, so I’m looking forward to catching up with them, as well as meeting new people. I will, however, be popping into the paper session of one of my co-workers. (It would be awesome if the online agenda provided permalinks to individual sessions.)

So, if you’re going to be at the UC next week and would like to meet up, feel free to ping me on Twitter (@billdollins), via e-mail (bill [at] geomusings [dot] com), or drop a comment below.

CFPB Fellowship Seeking 2015 Candidates

It’s no secret that I am a contractor who spends a lot of my time attempting to develop software for defense users. I’ve been doing this for a long time, though I have added other customers to my portfolio over the years. The process of development in this arena gets more frustrating by the day. Recently, for example, a group policy update was pushed that removed any browser other than Internet Explorer from our development machines and rolled Internet Explorer back to version 9. These are just the latest such setbacks to productivity, and they represent every stereotype we’ve ever heard about Federal Government computing.

Thankfully, there are countervailing examples that point to how things could be. One such example is the Technology and Innovation Fellowship at the Consumer Financial Protection Bureau (CFPB). This two-year fellowship program provides an opportunity to show how Federal software can be developed, with an open-source-first approach, and also how software development can occur, via remote teams and distributed collaboration. These are not new concepts in the overall marketplace, but they are still fairly exotic in the Beltway region. Ultimately, the fellowship holds out the possibility of building technology that actually helps government work better and shows how modern tools and working arrangements can be applied in the Federal Government.

I was clued into this fellowship program via a tweet by Mike Byrne, who has already helped show the way via the National Broadband Map at the FCC, and who is now at the CFPB. There are very few people I’ve met in the Federal Government who have a better vision for modernizing IT development and acquisition, coupled with the ability to get things done. If you’re of a technical bent and looking to work inside the Federal Government, this fellowship program may be something you want to check out.

Where Ya Been?

It’s been rather quiet on the blog for a while. Sometimes the posts have to take a back seat to work and other things. This time of year tends to be busy anyway due to the end of the school year and its related activities, but this year has also included one move, construction of a house, and preparations for a second (final) move. In December we sold our house, which I had lived in for nearly 40 years, and moved into temporary quarters while the next house was being built. The sale of the old place was a pretty smooth experience as all of us, especially me, were ready for a change.

As a result, the experimentation and small projects that have driven the content of this blog since it started simply had to stop for a while. That’s not to say that there has been no activity. I have posted over the last few months about some mapping work and the “software exhaust” that has resulted from it. It has not really been possible, however, to sit down and create a well-structured discussion of those activities in the way that I would prefer, so I simply haven’t.

Personal Geospatial Workflows, May 2014 Edition

I have been spending the past few weeks dealing more with data and mapping than I have in quite a while. It’s given me a chance to regain my footing with map-making, reconnect with some end-user tools like Arc2Earth, and build a little more proficiency with things like GDAL, QGIS, and TileMill. Of course, I’ve been able to sneak in some coding as I’ve identified gaps in my workflow.

In a nutshell, I am building base maps for use on disconnected mobile devices. There are two styles of base map: imagery (really more of an imagery/vector hybrid) and a high-contrast map for use on outdoor devices, sourced only from vector data. In both cases, I am building MBTiles databases to support the requirement for disconnected operations and to provide consistency in data size and performance.

For the imagery base maps, I was faced with following a data request process that may or may not have resulted in getting imagery in a timely fashion. Alternatively, I was presented with the option of using a tiled map service to get the imagery. Given that I was just making basemaps, this would have been acceptable but for the spotty speed and reliability of the network connection. The ideal solution was to get only the tiles I needed, store them locally, create a georeferenced image from each, and build a virtual raster (VRT) for each zoom level.
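
The tail end of that workflow lends itself to GDAL’s command-line tools. Below is a minimal sketch of the assembly steps, with hypothetical file names and placeholder coordinates, assuming the tiles for zoom level 12 have already been downloaded and their web mercator bounds computed from their z/x/y addresses:

    # Georeference one downloaded tile by assigning a projection and its computed bounds
    # (the <ulx> <uly> <lrx> <lry> values are placeholders)
    gdal_translate -of GTiff -a_srs EPSG:3857 -a_ullr <ulx> <uly> <lrx> <lry> 12_1171_1566.png 12_1171_1566.tif

    # Stitch all of the georeferenced tiles for the level into one virtual raster
    gdalbuildvrt level_12.vrt 12_*.tif

The VRT itself is just a small XML file that points at the underlying images, so no pixel data is duplicated on disk.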

Using Virtual Rasters to Generate Contours in QGIS

Every now and again, I am asked to make maps. It’s not my strongest suit, but it sometimes comes with the territory. My latest task, as mentioned in my previous post, involves building support for MBTiles databases into a mobile situational awareness tool so that the devices can have a persistent local basemap in the field. The need arose to ensure that the basemaps were high contrast to assist with visibility in bright sunlight.

One of the requirements was to have topographic-map-like contours to indicate changes in elevation. Existing map services didn’t provide what we needed, so it was necessary to build a custom map, which meant generating contour lines. It had been years since I had last done that with Esri tools, and I didn’t have any extension licenses available, so I turned to QGIS to get the job done this time.
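
QGIS hands this work to GDAL’s gdal_contour utility (Raster > Extraction > Contour), so the equivalent can also be run straight from the command line. A minimal sketch, with hypothetical file names and an assumed 10-meter interval:

    # Generate contour lines every 10 meters of elevation from a DEM,
    # writing each line's elevation value to an "elev" attribute
    gdal_contour -a elev -i 10.0 dem.tif contours.shp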

Data, Apps, and Maps

It’s been a quiet month-and-a-half here on the blog, mostly owing to an abundance of project tasks. I recently started a short-term project to help one of my Federal customers extend data source support for an application they have been developing. This customer is technically a new one, but the project team is made up of government developers I have worked with on a few other projects, so there is a great deal of familiarity.

The application, which has been under development for some time, is written in .NET and makes use of the open-source (MIT-licensed) GMap.NET mapping library. The application features a desktop version running on Windows and a mobile version running on Android tablets. The .NET back end works seamlessly on both through the use of Xamarin, although I have not had the chance to get my hands dirty with that yet due to limits on Xamarin licenses and available Android devices. To its credit, GMap.NET seems to work fairly well in both environments.

A Little Deeper With Node and PostGIS

What does one do when presented with more snow days than expected? My friends in Colorado would probably do something outrageous like skiing, but I found it to be a great opportunity to catch up on some of my recreational coding. Specifically, I wanted to revisit the Node/PostGIS work I blogged about earlier.

As fun as that project was, it was fairly limited in scope, and I wanted to spread my wings a little more with Node. So I decided to build a more general-purpose wrapper around PostGIS. Lately, I’ve become a bit obsessed with the idea that PostGIS may be the only GIS tool you really need in terms of data storage, management, and analytics. That’s probably a topic for another post, but exploring that concept was a perfect premise for my continued explorations with Node.

I have been circling back to Node over the past few months to continue building my comfort level with it. I tend to eschew frameworks when I am learning something new because I want to get comfortable with the core before I start layering on abstractions. That was my approach with the tile viewer tool I built a while back. For the recent post centered on Amazon RDS, I added Express into the mix, which has been a big help.

This time around, I wanted to dig a little deeper with the node-postgres module and also make the application more modular. I wanted to build around a few core principles:

  1. Keep it RESTful (or as close to it as I could)
  2. GeoJSON in/GeoJSON out (so… vector only for now)
  3. Let PostGIS do the heavy lifting
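
To make the shape of that concrete, here is a minimal sketch (not the actual code from this project) of what a single read endpoint might look like with Express and node-postgres. The route, connection string, and table/column names (gid, geom) are all hypothetical:

    var express = require('express');
    var pg = require('pg');

    // Placeholder connection string
    var pool = new pg.Pool({
      connectionString: 'postgres://user:password@localhost:5432/geodb'
    });

    var app = express();

    // GeoJSON out: return the contents of a table as a FeatureCollection
    app.get('/layers/:table', function (req, res) {
      // Let PostGIS do the heavy lifting; ST_AsGeoJSON returns each geometry
      // as ready-made GeoJSON text. A real application would whitelist table
      // names rather than trust the URL.
      var sql = 'SELECT gid, ST_AsGeoJSON(geom) AS geometry FROM ' + req.params.table;
      pool.query(sql, function (err, result) {
        if (err) {
          return res.status(500).json({ error: err.message });
        }
        res.json({
          type: 'FeatureCollection',
          features: result.rows.map(function (row) {
            return {
              type: 'Feature',
              geometry: JSON.parse(row.geometry),
              properties: { gid: row.gid }
            };
          })
        });
      });
    });

    app.listen(3000);

GeoJSON in would follow the same pattern in reverse, with ST_GeomFromGeoJSON handling the conversion inside an INSERT statement.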