
Cloudflare Workers PostgreSQL | Explained

by | Nov 6, 2022

Today, let's have a deeper look at Cloudflare Workers with PostgreSQL in this article by our Server Management Support Services at Bobcares.

PostgreSQL from Cloudflare Workers using a database connector


For a quick start, you will use Docker to run a local instance of Postgres and PgBouncer, and securely expose the stack to the Internet using Cloudflare Tunnel.

 

To get started:

 
  • First, run the below git command to clone a basic Postgres database connector project.
 
  • After running the git clone command, change into the new project directory.
 
$ git clone https://github.com/cloudflare/worker-template-postgres/
$ cd worker-template-postgres
 

Cloudflare Tunnel authentication

 

To create and manage secure Cloudflare Tunnels, you need to authenticate the cloudflared CLI. You can skip this step if you have already authenticated cloudflared on your system.

 
$ docker run -v ~/.cloudflared:/etc/cloudflared cloudflare/cloudflared:2021.11.0 login
 

Running this command will:

 
  • Prompt you to select your Cloudflare account and hostname.
 
  • Download credentials and allow cloudflared to create Tunnels and DNS records.
 

Initialize and prepare the Postgres database

 

Start the Postgres server

 

A prepared docker-compose file that does not require any changes is available in scripts/postgres, with the following services:

 
  • postgres
 
  • pgbouncer: provides connection pooling, placed in front of Postgres.
 
  • cloudflared: allows your applications to connect securely through an encrypted tunnel, without opening any local ports.
 

Run the following commands to start all services. Replace postgres-tunnel.example.com with a hostname in your Cloudflare zone to route traffic through this tunnel.

 
$ cd scripts/postgres
$ export TUNNEL_HOSTNAME=postgres-tunnel.example.com
$ docker compose up

# Alternative: Run `docker compose up -d` to start docker-compose detached
 

The docker-compose setup will spin up and configure all the services, including the installation of the Tunnel's DNS record. The DNS record will point to the Cloudflare Tunnel, which secures the connection between the local instance of cloudflared and the Cloudflare network.

 

Import example dataset

 

Once Postgres is up and running, seed the database with a schema and a dataset. This article uses the Pagila schema and dataset. Use docker exec to execute psql inside the running Postgres container and import the Pagila schema and dataset.

 
$ curl https://raw.githubusercontent.com/devrimgunduz/pagila/master/pagila-schema.sql | docker exec -i postgres_postgresql_1 psql -U postgres -d postgres
$ curl https://raw.githubusercontent.com/devrimgunduz/pagila/master/pagila-data.sql | docker exec -i postgres_postgresql_1 psql -U postgres -d postgres
 

This will download the SQL schema and dataset files from Pagila’s GitHub repository, and execute them in a local Postgres database instance.

 

Edit Worker and query Pagila dataset

 

Database connection settings

 

In src/index.ts, replace https://dev.example.com with your Cloudflare Tunnel hostname, ensuring that it is prefixed with the https:// protocol:

 
src/index.ts
const client = new Client({
  user: 'postgres',
  database: 'postgres',
  hostname: 'https://REPLACE_WITH_TUNNEL_HOSTNAME',
  password: '',
  port: 5432,
});
 

At this point, you can deploy the Worker and make a request to verify that your database connection is working.
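
To see how the pieces fit together, here is a minimal sketch of a fetch handler that opens a connection and runs a trivial query. It is only an illustration: the import path and the connect()/end() methods (which follow deno-postgres conventions) are assumptions, so check the template's actual src/index.ts for the real wiring.

 
// Minimal sketch of a Worker fetch handler that verifies the database connection.
// The import path and the connect()/end() methods are assumptions based on
// deno-postgres conventions; adapt them to the template's actual code.
import { Client } from './driver/postgres/postgres';

async function handleRequest(request: Request): Promise<Response> {
  const client = new Client({
    user: 'postgres',
    database: 'postgres',
    hostname: 'https://REPLACE_WITH_TUNNEL_HOSTNAME',
    password: '',
    port: 5432,
  });

  await client.connect();
  try {
    // A trivial query that confirms the connection works end to end.
    const result = await client.queryObject('SELECT 42 as answer;');
    return new Response(JSON.stringify(result), {
      headers: { 'Content-Type': 'application/json' },
    });
  } finally {
    await client.end();
  }
}

// Service-worker syntax, matching `type = "javascript"` in wrangler.toml.
addEventListener('fetch', (event: FetchEvent) => {
  event.respondWith(handleRequest(event.request));
});
 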

 

Query Pagila dataset

 

The template script includes a simple query that selects a number (SELECT 42;) and executes it in the database. Edit the script to query the Pagila dataset instead when the pagila-table query parameter is present.

 
// Query the database.

// Parse the URL, and get the 'pagila-table' query parameter (which may not exist)
const url = new URL(request.url);
const pagilaTable = url.searchParams.get('pagila-table');

let result;
// if pagilaTable is defined, run a query on the Pagila dataset
if (
  [
    'actor',
    'address',
    'category',
    'city',
    'country',
    'customer',
    'film',
    'film_actor',
    'film_category',
    'inventory',
    'language',
    'payment',
    'payment_p2020_01',
    'payment_p2020_02',
    'payment_p2020_03',
    'payment_p2020_04',
    'payment_p2020_05',
    'payment_p2020_06',
    'rental',
    'staff',
    'store',
  ].includes(pagilaTable)
) {
  result = await client.queryObject(`SELECT * FROM ${pagilaTable};`);
} else {
  const param = 42;
  result = await client.queryObject(`SELECT ${param} as answer;`);
}
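
Note that the hardcoded array of table names acts as an allowlist: because pagilaTable is interpolated directly into the SQL string, only values matching a known Pagila table are ever executed, so an arbitrary value in the query parameter cannot inject SQL. To make the result visible in the browser, the handler can then return it as JSON; the fragment below continues inside the handler and is only a sketch, since the template's actual response shape may differ.

 
// Continuing inside the fetch handler, after `result` has been set above:
// return the query result to the caller as JSON (the template's actual
// response shape may differ).
return new Response(JSON.stringify({ result }), {
  headers: { 'Content-Type': 'application/json' },
});
 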
 

Worker deployment

 

In wrangler.toml, enter your Cloudflare account ID in account_id:

 
wrangler.toml
name = "worker-postgres-template"
type = "javascript"
account_id = ""
 

Publish your function:

 
$ wrangler publish
Built successfully, built project size is 10 KiB.
Successfully published your script to https://workers-postgres-template.example.workers.dev
 

Set secrets

 

Next, create and save the Client ID and the Client Secret as Worker secrets, in case your Tunnel is protected by Cloudflare Access.

 
$ wrangler secret put CF_CLIENT_ID
$ wrangler secret put CF_CLIENT_SECRET
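
When the Tunnel sits behind Cloudflare Access, the Worker has to present these values as an Access service token on every request it makes to the Tunnel hostname. The sketch below is only a hypothetical illustration of that idea: with the service-worker syntax this template uses, secrets created via wrangler secret put are exposed as global variables, and Access expects them in the CF-Access-Client-Id and CF-Access-Client-Secret headers. The template itself may wire this up differently, so treat it as an outline rather than the template's actual code.

 
// Hypothetical sketch: send the Access service token with a request to the Tunnel.
// Secrets created with `wrangler secret put` appear as global variables in the
// service-worker format used by this template.
declare const CF_CLIENT_ID: string;
declare const CF_CLIENT_SECRET: string;

async function pingTunnel(): Promise<Response> {
  return fetch('https://REPLACE_WITH_TUNNEL_HOSTNAME', {
    headers: {
      'CF-Access-Client-Id': CF_CLIENT_ID,
      'CF-Access-Client-Secret': CF_CLIENT_SECRET,
    },
  });
}
 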
 

Test the Worker

 

To request Pagila tables, add the ?pagila-table query parameter with a table name to the URL of the Worker.

 
$ curl https://example.workers.dev/?pagila-table=actor
$ curl https://example.workers.dev/?pagila-table=address
$ curl https://example.workers.dev/?pagila-table=country
$ curl https://example.workers.dev/?pagila-table=language
 

Cleanup

 

Run the below command to stop and remove the Docker containers and networks:

 
$ docker compose down

# Stop and remove containers, networks

 


 

Conclusion

To conclude, this Cloudflare Workers PostgreSQL article has shown how to fetch data from a PostgreSQL database in your Cloudflare Workers applications using a Postgres database connector.

 
