Integrating Stripe with BigQuery

One of the projects I mentioned in my post a couple of weeks ago was the migration of our billing system to Stripe. Stripe is widely used for billing on the internet, in both SaaS and non-SaaS use cases. A while back, I wrote about the general flexibility limitations of IPaaS platforms, and integrating with Stripe exposed many of them.

One product did not expose all of the object types we needed to extract from Stripe. Another simply did not sync all of the object types it claimed to sync. A third had a clear bug in which it wrote the current date/time into every date field. In each case, we were left to file support tickets and wait. I moved on.
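To make the problem concrete, here's a minimal sketch of the kind of extraction these products were supposed to handle for us: pulling a single Stripe object type with Stripe's Node SDK and loading it into BigQuery. The dataset, table, and field selection here are illustrative assumptions, not our actual pipeline.

```typescript
// Illustrative sketch only: extract one Stripe object type (invoices)
// and load it into BigQuery. Dataset/table names are assumptions.
import Stripe from 'stripe';
import { BigQuery } from '@google-cloud/bigquery';

const stripe = new Stripe(process.env.STRIPE_API_KEY!);
const bigquery = new BigQuery();

async function syncInvoices() {
  const rows: object[] = [];

  // The Stripe SDK's async iterator handles pagination transparently.
  for await (const invoice of stripe.invoices.list({ limit: 100 })) {
    rows.push({
      id: invoice.id,
      customer: invoice.customer,
      status: invoice.status,
      amount_due: invoice.amount_due,
      // Stripe timestamps are Unix epoch seconds; convert for BigQuery.
      created: new Date(invoice.created * 1000).toISOString(),
    });
  }

  // Streaming insert into a pre-created table.
  await bigquery.dataset('stripe').table('invoices').insert(rows);
}

syncInvoices().catch(console.error);
```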

BigQuery and Koop

As I continued my experimentation with BigQuery, I found myself wanting to use it more easily with my regular GIS tool set. BigQuery has a lot of powerful analytic capability, but the SQL console is intimidating for the casual user and the GeoViewer tool is fairly limited. As I began digging deeper in my previous …
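To give a sense of where this was headed: a Koop provider wraps a data source in a model whose getData method hands GeoJSON back to Koop, which can then serve it to GIS clients as a Feature Service. Here's a minimal sketch of a provider model backed by BigQuery; the project, table, and column names are placeholder assumptions.

```typescript
// Minimal sketch of a Koop provider model backed by BigQuery.
// The dataset/table and column names are illustrative assumptions.
import { BigQuery } from '@google-cloud/bigquery';

const bigquery = new BigQuery();

export class Model {
  // Koop calls getData on each request and expects GeoJSON back.
  async getData(req: any, callback: (err: Error | null, geojson?: any) => void) {
    try {
      const [rows] = await bigquery.query({
        query: `
          SELECT name, ST_X(geom) AS lon, ST_Y(geom) AS lat
          FROM \`my-project.gis.points_of_interest\``,
      });

      // Convert the tabular result into a GeoJSON FeatureCollection.
      callback(null, {
        type: 'FeatureCollection',
        features: rows.map((row: any) => ({
          type: 'Feature',
          geometry: { type: 'Point', coordinates: [row.lon, row.lat] },
          properties: { name: row.name },
        })),
      });
    } catch (err) {
      callback(err as Error);
    }
  }
}
```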

Routing with BigQuery and ArcGIS Platform APIs

This post is a continuation of last month’s post about analyzing location change with BigQuery. At the end of that post, I was already thinking of ways to extend the analysis and visualization. I decided to take the opportunity to explore Esri’s recently announced ArcGIS Platform APIs. These are the same APIs that have been available via an AGOL subscription or an ELA, but they are now offered in a consumption-based model, similar to Google or Mapbox APIs, that lets you use them without making a larger up-front commitment to the rest of the ArcGIS stack. Esri’s basemaps and location services have always been high-quality, so it’s nice to see them available under a more accessible model.

I decided to use the Esri routing API to visualize possible routes between the various locations of the “Sample Asset” from my last post. I chose to build a very simple Node API to access the BigQuery data and to call that API from a basic page that hits the Esri API and displays the output on a Leaflet map. The first thing I needed to do was add a little extra SQL in BigQuery to return coordinates in a format the Esri API can consume. The raw API expects coordinates delimited as follows:

-76.519989228,38.303696474000006;-76.695443826137989,38.376038894414251;-76.635015354340467,38.29745667728772;-76.519989228,38.303696474000006;-76.695443826137989,38.376038894414251;-76.635015354340467,38.29745667728772;-76.519989228,38.303696474000006;-76.495959193,38.236694621
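For illustration, here's roughly what producing that string can look like, with BigQuery's STRING_AGG doing the delimiting server-side and the Node client running the query. The table and column names are placeholder assumptions, not the actual schema from the post.

```typescript
// Hypothetical sketch: query BigQuery for an asset's locations and
// format them as the semicolon-delimited stops string shown above.
import { BigQuery } from '@google-cloud/bigquery';

const bigquery = new BigQuery();

async function getStops(assetName: string): Promise<string> {
  const query = `
    SELECT
      -- Build "lon,lat" pairs ordered by observation time, then join
      -- them with semicolons as the Esri routing API expects.
      STRING_AGG(
        CONCAT(CAST(longitude AS STRING), ',', CAST(latitude AS STRING)),
        ';' ORDER BY observed_at
      ) AS stops
    FROM \`my-project.assets.asset_locations\`
    WHERE asset_name = @assetName`;

  const [rows] = await bigquery.query({ query, params: { assetName } });
  return rows[0].stops;
}

// Usage: the result becomes the "stops" parameter of the Esri route request.
getStops('Sample Asset').then((stops) => console.log(stops));
```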

Analyzing Location Change Over Time in BigQuery

I’ve recently spent a lot of time doing various forms of business analytics in BigQuery. As discussed in a previous post, I’ve been using BigQuery as the data integration environment for several business systems. I’ve found integration at the data level via an ETL/ELT/IPaaS pipeline to be a lot more stable than system-level integrations that involve chaining together dependencies on fairly volatile SaaS APIs.

The original premise was fairly straightforward: Given a table of user-level statistics over time, identify only those points in time where one or more of the statistics changed value. In our case, we had several million rows of user-level data captured on a daily cadence. Manually inspecting this data for changes in individual values by customer was simply not a viable plan. The BigQuery LAG function came to the rescue.
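To sketch that pattern (with placeholder table and column names): LAG compares each row's value to the previous row's within a per-user window ordered by date, so filtering on that comparison leaves only the rows where something actually changed.

```typescript
// Sketch of the change-detection pattern described above. Table and
// column names are placeholders, not the actual schema.
import { BigQuery } from '@google-cloud/bigquery';

const bigquery = new BigQuery();

const query = `
  WITH with_previous AS (
    SELECT
      user_id,
      snapshot_date,
      stat_value,
      LAG(stat_value) OVER (
        PARTITION BY user_id ORDER BY snapshot_date
      ) AS prev_value
    FROM \`my-project.analytics.daily_user_stats\`
  )
  SELECT user_id, snapshot_date, stat_value, prev_value
  FROM with_previous
  -- Keep each user's first observation plus any row whose value
  -- differs from the prior day's; unchanged repeats drop out.
  WHERE prev_value IS NULL OR stat_value != prev_value
  ORDER BY user_id, snapshot_date`;

async function findChanges() {
  const [rows] = await bigquery.query({ query });
  rows.forEach((row) => console.log(row));
}

findChanges().catch(console.error);
```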

FME, Salesforce, and BigQuery

More often than not in my current role, opportunities to get my hands dirty come from the data side of our operation rather than the engineering side. These days, the data side involves corporate data rather than much geospatial data. If I were guided by my own inertia, I’d drift toward traditional geospatial data 99% of the time, but working with other data stores and building pipelines around them is good exposure.

Most recently, I’ve been working a lot with Salesforce and other data sources to support customer success operations. Customer success, as a discipline, is relatively new; it grew out of the SaaS market’s best practices for nurturing and growing customers post-sale as part of the land-and-expand model.

SaaS typically begets SaaS – meaning that you won’t often find SaaS companies using on-prem versions of business tools. This presents interesting challenges for data integration and analytics. Fifteen years ago, there’d most likely be a server room with installs of various vertical systems that had been configured to use whatever the organization’s blessed database platform was. In the event that an individual system didn’t support that database, there might be some nominal ETL performing a one-way sync so that the necessary charts and graphs could be made as needed.
