Cloud Run functions connecting to postgres

By John Keyes

October 9, 2024 at 12:17

gcp postgres supabase

Introduction

The Tapadoo build server API (beta-server-app) calls some GCP Cloud Run functions to perform asynchronous tasks like adding a build record to the database.

This function currently uses a Firestore database, but I don’t believe an object database is a good fit for this data, and it has resulted in a lot of data duplication between collections.

As Postgres is the relational database we use most at Tapadoo, I investigated different ways for the JavaScript code in our Cloud Run functions to write data to a Postgres instance hosted by Supabase.

I used four different methods to do so, and I’ll go through each one of them here.

Secrets

The first thing I needed to resolve was a secure way to store the database credentials. GCP has a Secret Manager product that allows us to do this.

The following screencast shows where to find the Secret Manager, and how to add and delete secrets.
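Secrets can also be created from the command line. A minimal sketch using gcloud — the secret names match the ones used below, but the placeholder values are obviously assumptions, and this presumes the Secret Manager API is already enabled for the project:

```shell
# Store the Supabase URL and API key in Secret Manager.
# echo -n avoids storing a trailing newline with the value.
echo -n "https://your-project.supabase.co" | \
  gcloud secrets create SUPABASE_URL --data-file=-

echo -n "your-supabase-api-key" | \
  gcloud secrets create SUPABASE_KEY --data-file=-
```

The Firebase CLI offers an equivalent with `firebase functions:secrets:set SUPABASE_URL`, which prompts for the value interactively.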

To make secrets available to JavaScript Cloud Run functions we need to import the defineSecret function and then call it for the secrets we are interested in:

const { defineSecret } = require('firebase-functions/params');
const supabaseUrl = defineSecret('SUPABASE_URL');
const supabaseKey = defineSecret('SUPABASE_KEY');

When using these constants in the function definition we must use the runWith function and specify the secrets we’re interested in. Each scenario below includes this function call.

Supabase Library

The first approach uses the Supabase JavaScript library which can be installed by:

npm i @supabase/supabase-js

We can only use supabase-js with Supabase-hosted Postgres, as it’s a wrapper for the PostgREST API that Supabase exposes for its databases. This is why we pass the project URL and API key when creating a client for this library. We then use Supabase’s function-based syntax to insert the record.

const { createClient } = require('@supabase/supabase-js');
...
exports.insertSupabase = functions.region('europe-west2').runWith({
    secrets: [supabaseUrl, supabaseKey]
  }).https.onCall(async (data, context) => {
    // note we must call the value method of the secret
    const supabase = createClient(supabaseUrl.value(), supabaseKey.value());
    // insert a record into the Projects table;
    // note supabase-js doesn't throw on failure — it resolves to an
    // object with `data` and `error` properties, so check resp.error
    let resp = await supabase
      .from('Projects')
      .insert([
        { name: data.name || 'Supabase', platform: 'iOS' }
      ]);
});

To get the result of the insert we can chain a .select():

  ...
    .insert([...])
    .select();

Postgres

The second approach uses a direct connection using Postgres.js:

npm i postgres

For Postgres.js, we use a Postgres URL to instantiate the client. This library uses SQL to perform the insert, and as it uses tagged template literals we can reference the values using standard ${} syntax. Note that Postgres.js does not splice these values into the query string; it sends them as query parameters, so this is safe from SQL injection.

const postgres = require('postgres');
const supabasePostgresUrl = defineSecret('SUPABASE_POSTGRES_URL');
...
exports.insertPostgres = functions.region('europe-west2').runWith({
    secrets: [supabasePostgresUrl]
  }).https.onCall(async (data, context) => {
    const client = postgres(supabasePostgresUrl.value());
    // insert a record into the Projects table
    const res = await client`
    insert into "Projects" (
      name, platform
    ) values (
      ${data.name || 'Postgres'}, 'iOS'
    )
    returning *
    `;
    // close the connection so it isn't left open after the insert
    await client.end();
});

pg

Now, let’s look at the third version. This is another direct connection using the pg library:

npm i pg

As pg is also a direct connection, we pass the URL when instantiating the client. pg uses raw SQL, and in this case we have to use query placeholders to avoid SQL injection issues:

const pg = require('pg');
...
exports.insertPg = functions.region('europe-west2').runWith({
    secrets: [supabasePostgresUrl]
  }).https.onCall(async (data, context) => {
    const query = 'INSERT INTO "Projects"(name, platform) VALUES($1, $2) RETURNING *'
    const values = [data.name || 'PG', 'iOS']
    const connectionUrl = supabasePostgresUrl.value();
    const { Client } = pg
    const client = new Client(connectionUrl);
    await client.connect();
    const res = await client.query(query, values)
    await client.end();
});
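To see why placeholders matter, compare a naively interpolated statement with the parameterized form pg sends. A minimal sketch — the `naiveInsert` helper is hypothetical and only here for illustration; no database connection is needed:

```javascript
// Hypothetical helper: splicing user input straight into the SQL text.
function naiveInsert(name) {
  return `INSERT INTO "Projects"(name, platform) VALUES('${name}', 'iOS')`;
}

// Malicious input can then change the statement itself:
const evil = 'x\', \'iOS\'); DROP TABLE "Projects"; --';
console.log(naiveInsert(evil));

// With placeholders the SQL text is fixed, and the values travel
// separately, so input can never alter the statement:
const query = 'INSERT INTO "Projects"(name, platform) VALUES($1, $2) RETURNING *';
const values = [evil, 'iOS'];
console.log(query, values);
```

With the parameterized form, the `evil` string would simply be stored as an odd-looking project name.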

Drizzle

And to take a completely different approach I then tried Drizzle, an ORM:

npm i drizzle-orm

In this case we create a Drizzle Table to represent the Projects database table. To insert the data we then call insert for that table:

const { drizzle } = require('drizzle-orm/postgres-js');
const { pgTable, text } = require('drizzle-orm/pg-core');
...
exports.insertDrizzle = functions.region('europe-west2').runWith({
    secrets: [supabasePostgresUrl]
  }).https.onCall(async (data, context) => {
    const client = postgres(supabasePostgresUrl.value());
    const db = drizzle(client);
    const projects = pgTable('Projects', {
      name: text('name').notNull(),
      platform: text('platform').notNull(),
    });
    const result = await db.insert(projects).values({
      name: data.name || "Drizzle",
      platform: 'iOS',
    }).returning();
    await client.end();
});

Using an ORM has pros and cons. In this case it isn’t very useful, but if the table schema had been predefined it could lower the barrier to entry for someone unfamiliar with SQL.
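In practice that table definition would live in a shared module rather than inside the handler, so every function reuses one schema. A sketch of what that could look like — the `schema.js` filename and module layout are assumptions, not taken from the build server code:

```javascript
// schema.js — hypothetical shared module holding the Drizzle table
// definitions, so handlers don't each redeclare the Projects table.
const { pgTable, text } = require('drizzle-orm/pg-core');

const projects = pgTable('Projects', {
  name: text('name').notNull(),
  platform: text('platform').notNull(),
});

module.exports = { projects };
```

Each function then does `const { projects } = require('./schema');` and calls `db.insert(projects)` as before, and the schema has a single source of truth.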

Conclusion

There isn’t much to conclude here; this was simply a showcase of some of the different approaches that could be taken. If you want to go the direct-SQL route, there is little to choose between pg and postgres. However, if you are going to host your entire application on Supabase, you could sacrifice portability and go with supabase-js.

Last updated: October 9, 2024 at 12:17