Routing with BigQuery and ArcGIS Platform APIs

This post is a continuation of last month’s post about analyzing location change with BigQuery. At the end of that post, I was already thinking of ways to extend the analysis and visualization, so I took the opportunity to explore Esri’s recently-announced ArcGIS Platform APIs. These are the same APIs that have been available via an AGOL subscription or an ELA, but they are now offered in a consumption-based model, similar to Google or Mapbox APIs, that allows you to use them without making a larger up-front commitment to the rest of the ArcGIS stack. Esri’s basemaps and location services have always been high-quality, so it’s nice to see them available under a more accessible model.

I decided to use the Esri routing API to visualize possible routes between the various locations of the “Sample Asset” from my last post. I chose to build a very simple Node API to access the BigQuery data and use that API from a simple page that calls the Esri API and displays the output on a Leaflet map. The first thing I needed to do was add a little extra SQL in BigQuery to return coordinates in a format consumable by the Esri API. The raw API expects coordinates delimited as such:

x1,y1;x2,y2;x3,y3

This syntax happens to be the same as that used by OSRM, which will be the subject of a future post. I wrapped the query from the previous post in a subquery and extracted the delimited coordinates like this (post continues after the code block):

SELECT
  'Sample Asset' AS name,
  ARRAY_TO_STRING(ARRAY_AGG(CONCAT(st_x(q.loc), ',', st_y(q.loc)) ORDER BY q.capture_date), ';') AS coords
FROM (
  SELECT
    resource_id,
    capture_date,
    loc,
    st_astext(loc) AS wkt,
    (LAG(capture_date) OVER (PARTITION BY resource_id ORDER BY capture_date ASC)) AS prev_date,
    (LAG(st_astext(loc)) OVER (PARTITION BY resource_id ORDER BY capture_date ASC)) AS prev_loc
  FROM my_dataset.asset_locations  -- source table from the previous post; the name here is a placeholder
  ORDER BY capture_date ASC) q
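To make the target format concrete, here is the same delimiting logic the SQL performs, expressed as a small JavaScript helper. The function name and the sample coordinates are my own, for illustration only:

```javascript
// Builds the "x1,y1;x2,y2" string the routing API expects from an
// array of {x, y} points, mirroring the ARRAY_TO_STRING/ARRAY_AGG/CONCAT
// combination in the SQL above.
function formatCoords(points) {
  return points.map((p) => `${p.x},${p.y}`).join(';');
}

// Example:
// formatCoords([{x: -77.03, y: 38.89}, {x: -76.61, y: 39.29}])
//   → "-77.03,38.89;-76.61,39.29"
```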

I saved that query as a view to make life a little easier in my application. On the Node side of things, I used the nodejs-bigquery client library. Pay special attention to the prerequisite setup steps in the quickstart section of the readme to ensure BigQuery is set up to work with your app. The samples in the readme are also great for getting started.

I used Express to make a super-simple API. I won’t go into the details of that here because that’s a well-worn path and a search engine will surface better explanations than I could provide. Within the app, there’s one important step in setting up interaction with BigQuery: it needs to know which project you are accessing and which credentials you are using. It supports a number of authentication methods, but I’m using a service account key file to keep things simple for this post. Both of those things need to be exposed via environment variables, which can be done like so:

//Google Cloud Environments
process.env.GOOGLE_CLOUD_PROJECT = "my-project-name";
process.env.GOOGLE_APPLICATION_CREDENTIALS = "path/to/my/key.json";
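Since both variables are required, it can be worth failing fast at startup when one is missing. This helper is my own sketch, not part of the client library:

```javascript
// Sketch: verify the Google Cloud environment variables used above are
// set before creating the BigQuery client. Throws with a descriptive
// message listing anything missing.
function checkGoogleEnv(env = process.env) {
  const required = ['GOOGLE_CLOUD_PROJECT', 'GOOGLE_APPLICATION_CREDENTIALS'];
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
}
```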

The BigQuery Node library uses async calls for just about everything. Here’s the function to query the view above and get the coordinates for the stops in the required syntax.

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function query() {
    // Queries the view to get stops along the route.
    const query = `SELECT coords FROM \`my_dataset.vw_stops\` LIMIT 1`;

    // For all options, see the client library documentation.
    const options = {
      query: query,
      // Location must match that of the dataset(s) referenced in the query.
      location: 'US',
    };

    // Run the query as a job
    const [job] = await bigquery.createQueryJob(options);
    console.log(`Job ${job.id} started.`);

    // Wait for the query to finish
    const [rows] = await job.getQueryResults();

    // Return the formatted coordinate string
    return rows[0].coords;
}

That’s called via an Express API endpoint like so:

app.get('/stops', async (req, res) => {
    res.send(await query());
});

So a simple URL call like https://localhost/stops will return the formatted coordinate string, ready for client-side use. I modified Esri’s sample routing app to make this one (which is the intent of the sample), but I’ll skip all of the Leaflet scaffolding and just focus on the parts that do the lifting. Just assume that you need to do some setup to make a pretty map and that you’ll need an Esri API key. In the client-side script, this makes the call to the Express API defined above:

    // Local API call to BigQuery
    const fetchStops = async () => {
      // In real life, there should be exception handling
      const response = await fetch('/stops');
      const stopVals = await response.text();
      updateRoute(stopVals);
    };

Assuming all goes well, the call goes all the way through to BigQuery and the stops get returned to the client. Because it’s fun to work without a net, the result is immediately sent to the routing function, updateRoute.

    function updateRoute(stopVals) {
      // Create the arcgis-rest-js authentication object to use later.
      const authentication = new arcgisRest.ApiKey({
        key: apiKey
      });

      // Make the API request
      arcgisRest
        .solveRoute({
          stops: processStops(stopVals),
          endpoint: "",  // the routing service URL goes here
          authentication
        })
        .then((response) => {
          // Show the result route on the map.
          // (routeLines is a Leaflet layer group created in the omitted map setup.)
          routeLines.clearLayers();
          L.geoJSON(response.routes.geoJson).addTo(routeLines);

          // Show the result text directions on the map.
          const directionsHTML = response.directions[0].features.map((f) => f.attributes.text).join("<br/>");
          directions.innerHTML = directionsHTML;
        })
        .catch((error) => {
          console.error(error);
          alert("There was a problem using the route service. See the console for details.");
        });
    }

The “processStops” call is a helper function to further format the stops for use by the Esri client-side API. The function above does the work of calling the API, putting the route on the map, and displaying the directions. The output can be seen in the header image of this post.
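I haven’t shown processStops itself; here is a sketch of what such a helper could look like, assuming the route request accepts stops as an array of [longitude, latitude] pairs. The implementation details here are mine, not Esri’s:

```javascript
// Converts the "x1,y1;x2,y2" string returned by the /stops endpoint into
// an array of [longitude, latitude] number pairs for the routing request.
function processStops(stopVals) {
  return stopVals
    .split(';')
    .map((pair) => pair.split(',').map(Number));
}

// Example:
// processStops('-77.03,38.89;-76.61,39.29')
//   → [[-77.03, 38.89], [-76.61, 39.29]]
```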

It was fairly easy to integrate BigQuery with an ArcGIS Platform API, which opens up possibilities for more flexible analysis and visualization than staying entirely in the Google ecosystem. Next up, I’ll give it a go with some open-source offerings, like OSRM.

The code for this post can be found here: