I’ve been integrating pg_webhooks into some project work lately. I have not, on the other hand, been using n8n much, though a vendor rep recently used it to demo workflow automation capabilities for a platform one of my customers uses. I had dabbled with n8n about two years ago but haven’t done much with it since.
One thing I have been lax about is building practical examples of how to use pg_webhooks, and I thought n8n could be the basis of such an example. n8n is a platform in the iPaaS (Integration Platform as a Service) class of tools. It is built with Node and JavaScript, which I like a lot, and offers a self-hosting option, which I like even more. The application was re-licensed from Apache 2.0 to a “Sustainable Use License” in March of 2022. Such re-licensing has become a trend – not a good one, but a trend nonetheless. The fact that this post uses n8n in its example is not an explicit endorsement of n8n or its license.
With that bit of housekeeping out of the way, let’s proceed with the example. Here, I use a self-hosted version of pg_webhooks connected to my local PostgreSQL server. It will be communicating with a paid, cloud-hosted version of n8n. We’ll start with setting up pg_webhooks.
1. pg_webhooks Setup
The easiest way to get going is to clone the pg_webhooks repository from GitHub. After that, we need to install its dependencies with npm install (or use yarn if that’s your thing).
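As a quick sketch, that setup looks something like this. The repository URL below is a placeholder; substitute the actual pg_webhooks repository address.

# Clone the pg_webhooks repository (URL is a placeholder) and install dependencies
git clone https://github.com/<your-org>/pg_webhooks.git
cd pg_webhooks
npm install    # or: yarn install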
Next, we’ll create the subscriptions table in the public schema of our desired database. As of this writing, pg_webhooks requires the subscriptions table to be in the public schema, and it must also reside in the database from which notifications will be raised. So, if we plan to raise notifications from our test database, then the subscriptions table must reside in the public schema of the test database.
The repo contains a SQL script that does all of the table setup for us. We simply run that script in our database using the PostgreSQL client of our choosing.
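With psql, for example, that might look like the line below. The script filename here is illustrative; use whichever file the repo actually provides.

# Run the repo's setup script against the database that will raise notifications
psql -d test -f create_subscriptions.sql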
Now that the table is created, we’ll need to create a .env file to hold our PostgreSQL connection information. This is suitable for development, but it is strongly recommended to use system-level environment variables in production. The repo contains a file called example.env. Simply copy that to a file named .env and edit it to reflect our information.
The PGDATABASE variable should contain the name of the database where we created the subscriptions table. This post won’t discuss MAXFAILS, so there’s no need to edit it.
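As a rough sketch, a development .env might look like the following. The variable names other than PGDATABASE are assumptions based on the standard libpq-style connection variables; check example.env in the repo for the authoritative list.

# .env — development values only (illustrative); use system environment variables in production
PGHOST=localhost
PGPORT=5432
PGUSER=postgres
PGPASSWORD=changeme
PGDATABASE=test    # the database containing the subscriptions table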
With these tasks done, we should be able to run pg_webhooks by issuing the command npm start. Assuming this was successful, we’ll move on to the next step: creating a webhook receiver in n8n and registering it with pg_webhooks.
2. Webhook Configuration
This step has three parts: 1) creating a webhook trigger in n8n, 2) registering its URL as a subscription in pg_webhooks, and 3) firing a test event to capture the payload schema in n8n for use downstream in our workflow.
This post is not an n8n tutorial, so it assumes that we have n8n running (either self-hosted or SaaS) and are able to access the workspace editor. If we are self-hosting, we may need to do some extra network configuration to set up tunneling or a subdomain. Again, this post assumes we have successfully done all of that.
Creating the Webhook Trigger
This workflow in n8n will be triggered by a webhook call. To set that up, we add a webhook trigger to the canvas. Use the “plus” icon to access the list of available nodes.
Use the search tool and, when the “Webhook” node appears, choose it to add it to the editor canvas.
We can then double-click the node in the editor to open its property sheet and complete this part of the setup.
We’ll see that n8n generates both test and production URLs for us. We’ll use the test URL for the rest of this post. In the property sheet, we need to make sure the “HTTP Method” property is set to POST, as pg_webhooks only supports the POST method. All other properties can stay the same. Next, we’ll click on the URL to copy it to the clipboard. Now we’ll register the webhook with pg_webhooks.
Register the Webhook
Assuming that we were previously able to run pg_webhooks successfully and that it is currently running, we will use its /subscriptions/add endpoint to register the n8n test URL. pg_webhooks does not provide a UI for this, so we’ll need to use something like curl or Postman. Here is a screenshot of Postman showing the proper URL and payload configurations.
The JSON payload contains the information needed to subscribe. The “channel” property is the name of the notification channel in PostgreSQL, and the “callback” property is the test URL copied from n8n. When we send the payload via POST, we’ll receive a 200 response code and a JSON payload containing a subscription ID that we can store as needed. In the future, pg_webhooks will send this ID with all payloads to help validate inbound calls.
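For anyone who prefers the command line to Postman, a curl equivalent might look like the sketch below. The host and port of the pg_webhooks server and the callback URL are placeholders; the channel and callback fields are the ones described above.

# Register the n8n test URL as a subscriber to the "dev" channel
# (host, port, and callback URL are placeholders — use your own values)
curl -X POST http://localhost:3000/subscriptions/add \
  -H "Content-Type: application/json" \
  -d '{"channel": "dev", "callback": "https://<your-n8n-host>/webhook-test/<webhook-id>"}'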
At this point, the n8n test URL is registered with pg_webhooks. The final step is a test event so that n8n can retrieve the payload schema.
Firing a Test Event
Back in the webhook property page in the n8n editor, we will see a button labeled “Listen for Test Event”. We need to capture a sample payload from the webhook for use in downstream n8n nodes, so click the button and n8n will listen for a webhook event.
Once we do this, and with pg_webhooks running, we’ll need to trigger an event from within our database. The easiest way to do this is to call the pg_notify function from the PostgreSQL client of our choice. The name of the channel should match the one we used when registering the subscription; in this example, we used ‘dev’, so that should be the first function parameter. The second parameter is a sample payload. Here, we are using a simple string, but we will want a full example of our actual payload in order to properly configure downstream actions. Here is an example of how to use the pg_notify function.
select pg_notify('dev', 'this is a test');
Assuming all of the previous configuration steps have been done correctly, the test event will fire in n8n and we will see a sample of the response.
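In a real workflow, we’d likely re-fire the test with structured JSON rather than a bare string so that the captured schema has real fields to map downstream. A minimal sketch, with purely hypothetical field names, using PostgreSQL’s json_build_object:

-- Send a structured JSON payload on the 'dev' channel (field names are illustrative)
select pg_notify('dev', json_build_object(
  'event', 'order_created',
  'order_id', 42,
  'occurred_at', now()
)::text);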
With this sample payload, we can now configure downstream actions.
3. Doing Something
It’s time to actually do something! In this post, that something will be really simple: we’re going to write the timestamp of the event and the raw payload body to a Google Sheet. To do this, we’ll add a Google Sheets node in the same way we added the webhook trigger above. We’ll then double-click it to configure it in its property page.
We’ll configure our Google authentication as we see fit. For this example, we are appending a row to a sheet (“Sheet1”) in an existing workbook (“n8n test”). The sheet has two columns, date and message, which are mapped manually here. Each is populated using an expression: the date column gets the current date/time, and the message column gets the value of the “body” of the webhook response. We can only configure this because we fired the test event in the previous step.
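The exact expressions depend on the payload captured from the test event, but assuming the webhook output exposes the payload under a body property (as it did in our test), the two mappings might look something like this:

date: {{ $now }}
message: {{ $json.body }}

$now and $json are built-in n8n expression variables; the path to the payload (body here) should match whatever the test event actually captured.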
Once this configuration is complete, we can close the property page and choose “Execute Workflow” in the n8n canvas.
This will cause the workflow to listen for events. Now we can issue the same test SQL statement we did in the previous step. If the connection is successful, then pg_webhooks will receive a JSON response from n8n indicating the workflow has started.
This particular workflow is simple and will execute quickly. If we go look at our Google Sheet, we will see a new row.
4. Wrapping Up
The preceding example shows the basics of using pg_webhooks with an iPaaS like n8n. n8n supports numerous systems and operations, so we can design very robust workflows. Where we are simply writing data to a Google Sheet, we could instead write to a data warehouse such as BigQuery. Where we currently have a simple string for a payload, we could have a complex payload such as a GeoJSON feature collection. From there, we can perform sophisticated calculations or transformations before sending the data to its destination. My personal preference is to keep such workflows as simple as possible. Regardless of the complexity of what you do inside n8n, though, it all begins with a webhook trigger from pg_webhooks, so take care in how you design your payloads and your notification logic. One common pattern, sketched below, is to raise notifications from a table trigger.
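As a closing sketch, here is one way a notification might be raised automatically. The table, function, and channel usage are hypothetical; this assumes we want to notify the ‘dev’ channel whenever a row is inserted into an orders table.

-- Hypothetical trigger: notify the 'dev' channel when a row is inserted into "orders"
create or replace function notify_order_insert() returns trigger as $$
begin
  -- serialize the new row so downstream nodes have real fields to map
  perform pg_notify('dev', row_to_json(NEW)::text);
  return NEW;
end;
$$ language plpgsql;

create trigger orders_notify
after insert on orders
for each row execute function notify_order_insert();

Keep in mind that NOTIFY payloads are limited to roughly 8 kB by default, so for large rows it is often better to send just a key and let the workflow fetch the rest.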