
Cloudflare Workers PostgreSQL | Explained


Today, let us take a deeper look at Cloudflare Workers with PostgreSQL in this article from our Server Management Support Services team at Bobcares.

Query PostgreSQL from Cloudflare Workers using a database connector


For a quick start, you will use Docker to run a local instance of Postgres and PgBouncer, and securely expose the stack to the Internet using Cloudflare Tunnel.

 

To get started:

 
  • First, run the git clone command below to clone a basic Postgres database connector project.

  • After running the git clone command, change into the new project directory.
 
$ git clone https://github.com/cloudflare/worker-template-postgres/
$ cd worker-template-postgres
 

Cloudflare Tunnel authentication

 

To create and manage secure Cloudflare Tunnels, you first need to authenticate the cloudflared CLI. You can skip this step if you have already authenticated cloudflared on your system.

 
$ docker run -v ~/.cloudflared:/etc/cloudflared cloudflare/cloudflared:2021.11.0 login
 

Running this command will:

  • Prompt you to select your Cloudflare account and hostname.

  • Download credentials and allow cloudflared to create Tunnels and DNS records.
 

Initiate and prepare the Postgres database

 

Start the Postgres server

 

You can find a prepared docker-compose file that does not require any changes in scripts/postgres, with the below services:

 
  • postgres

  • pgbouncer: provides connection pooling, placed in front of Postgres.

  • cloudflared: allows your applications to connect securely through an encrypted tunnel, without opening any local ports.
 

Run the following commands to start all services. Replace postgres-tunnel.example.com with a hostname on your Cloudflare zone to route traffic through this tunnel.

 
$ cd scripts/postgres
$ export TUNNEL_HOSTNAME=postgres-tunnel.example.com
$ docker compose up

# Alternative: Run `docker compose up -d` to start docker-compose detached
 

The docker-compose command will spin up and configure all the services. This includes the installation of the Tunnel's DNS record. The DNS record will point to the Cloudflare Tunnel, which secures the connection between a local instance of cloudflared and the Cloudflare network.

 

Import example dataset

 

Once Postgres is up and running, seed the database with a schema and a dataset. The Pagila schema and dataset are used in this article. Use docker exec to run commands inside the Postgres container and import the Pagila schema and dataset.

 
$ curl https://raw.githubusercontent.com/devrimgunduz/pagila/master/pagila-schema.sql | docker exec -i postgres_postgresql_1 psql -U postgres -d postgres
$ curl https://raw.githubusercontent.com/devrimgunduz/pagila/master/pagila-data.sql | docker exec -i postgres_postgresql_1 psql -U postgres -d postgres
 

This will download the SQL schema and dataset files from Pagila’s GitHub repository, and execute them in a local Postgres database instance.

 

Edit Worker and query Pagila dataset

 

Database connection settings

 

In src/index.ts, replace https://dev.example.com with your Cloudflare Tunnel hostname, ensuring that it is prefixed with the https:// protocol:

 
src/index.ts
const client = new Client({
  user: 'postgres',
  database: 'postgres',
  hostname: 'https://REPLACE_WITH_TUNNEL_HOSTNAME',
  password: '',
  port: 5432,
});
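For context, the connection settings above sit inside the Worker's request handler. The sketch below assumes the deno-postgres-style API that the template's connector is modelled on (the exact template code may differ): the client is connected before queries run and closed when the work is done.

async function handle(request: Request): Promise<Response> {
  // Sketch only: not the exact template code.
  await client.connect();
  try {
    // ... run the queries shown in the next section here ...
    return new Response('ok');
  } finally {
    await client.end();
  }
}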
 

At this point, you can deploy the Worker and make a request to verify that your database connection is working.

 

Query Pagila dataset

 

The template script includes a simple query to select a number (SELECT 42;) that will execute in the database. Edit the script to query the Pagila dataset if the pagila-table query parameter is present.

 
// Query the database.

// Parse the URL, and get the 'pagila-table' query parameter (which may not exist)
const url = new URL(request.url);
const pagilaTable = url.searchParams.get('pagila-table');

let result;
// if pagilaTable is defined, run a query on the Pagila dataset
if (
  [
    'actor',
    'address',
    'category',
    'city',
    'country',
    'customer',
    'film',
    'film_actor',
    'film_category',
    'inventory',
    'language',
    'payment',
    'payment_p2020_01',
    'payment_p2020_02',
    'payment_p2020_03',
    'payment_p2020_04',
    'payment_p2020_05',
    'payment_p2020_06',
    'rental',
    'staff',
    'store',
  ].includes(pagilaTable)
) {
  result = await client.queryObject(`SELECT * FROM ${pagilaTable};`);
} else {
  const param = 42;
  result = await client.queryObject(`SELECT ${param} as answer;`);
}
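The result variable set above still has to be sent back to the caller. Below is a rough sketch of that final step, assuming the queryObject() result exposes its rows on result.rows as in the deno-postgres API; the template may format its response differently.

// Sketch only: serialize the selected rows (or the fallback answer) as JSON.
return new Response(JSON.stringify(result.rows), {
  headers: { 'Content-Type': 'application/json' },
});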
 

Worker deployment

 

In wrangler.toml, enter your Cloudflare account ID in account_id:

 
wrangler.toml
name = "worker-postgres-template"
type = "javascript"
account_id = ""
 

Publish your function:

 
$ wrangler publish
Built successfully, built project size is 10 KiB.
Successfully published your script to https://workers-postgres-template.example.workers.dev
 

Set secrets

 

Next, create and save the Client ID and the Client Secret as Worker secrets, in case your Tunnel is protected by Cloudflare Access.

 
$ wrangler secret put CF_CLIENT_ID
$ wrangler secret put CF_CLIENT_SECRET
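Secrets stored with wrangler secret put become available to the Worker at runtime. The sketch below is illustrative only and is not necessarily how the template's connector consumes them: an Access-protected hostname is normally reached by sending the service token in the CF-Access-Client-Id and CF-Access-Client-Secret request headers.

// Sketch only: CF_CLIENT_ID / CF_CLIENT_SECRET are the secrets created above,
// exposed to the script as bindings when using the service-worker format.
declare const CF_CLIENT_ID: string;
declare const CF_CLIENT_SECRET: string;

async function probeTunnel(): Promise<number> {
  // Cloudflare Access checks these headers before traffic reaches the tunnel.
  const response = await fetch('https://postgres-tunnel.example.com', {
    headers: {
      'CF-Access-Client-Id': CF_CLIENT_ID,
      'CF-Access-Client-Secret': CF_CLIENT_SECRET,
    },
  });
  return response.status;
}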
 

Test the Worker

 

Request some of the Pagila tables by adding the ?pagila-table query parameter, with a table name, to the URL of the Worker.

 
$ curl https://example.workers.dev/?pagila-table=actor
$ curl https://example.workers.dev/?pagila-table=address
$ curl https://example.workers.dev/?pagila-table=country
$ curl https://example.workers.dev/?pagila-table=language
 

Cleanup

 

Run the below command to stop and remove the Docker containers and networks:

 
$ docker compose down

# Stop and remove containers, networks


 

[Looking for a solution to another query? We are just a click away.]

 

Conclusion

To conclude, from this Cloudflare Workers PostgreSQL article, you have seen how to fetch data in your Cloudflare Workers applications from a PostgreSQL database using a Postgres database connector.

 

