@brianc
brianc / index.js
Created March 17, 2025 18:27
esm with pg
import pg from "pg";
const client = new pg.Client();
await client.connect();
const { rows } = await client.query("SELECT NOW()");
console.log("rows:", rows);
await client.end();
@brianc
brianc / gist:f906bacc17409203aee0
Last active December 22, 2023 00:47
Some thoughts on node-postgres in web applications

Some thoughts on using node-postgres in a web application

This is the approach I've been using for the past year or so. I'm sure it will change as I grow & am exposed to more ideas, but it's worked alright for me so far.

Pooling:

I would definitely use a single pool of clients throughout the application. node-postgres ships with a pool implementation that has always met my needs, but it's also fine to just use the require('pg').Client prototype and implement your own pool if you know what you're doing & have some custom requirements on the pool.
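To make the "implement your own pool" option above concrete, here is a minimal sketch of the checkout/release idea. The `createClient` factory and the `acquire`/`release` API are hypothetical illustrations, not pg's actual interface — a real pool also needs error handling, connection validation, and timeouts:

```javascript
// Minimal checkout/release pool sketch (hypothetical API, not pg's).
// Hands out idle clients first, creates new ones up to `size`, and
// queues callers once every client is checked out.
function createPool(createClient, size) {
  var idle = [];
  var waiting = [];
  var created = 0;

  function acquire(cb) {
    if (idle.length > 0) return cb(idle.pop());
    if (created < size) {
      created++;
      return cb(createClient());
    }
    waiting.push(cb); // all clients busy: queue the request
  }

  function release(client) {
    var next = waiting.shift();
    if (next) return next(client); // hand the client straight to a waiter
    idle.push(client);
  }

  return { acquire: acquire, release: release };
}
```

The key property is that `release` hands a client directly to the next waiter instead of returning it to the idle list, so queued work starts immediately.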

Using pg.connect is the way to go in a web environment.

The PostgreSQL server can only handle 1 query at a time per connection. That means if you have 1 global new pg.Client() connected to your backend, your entire app is bottlenecked by how fast postgres can respond to queries. It literally will line everything up, queuing each query. Yeah, it's async and so that's alright...but wouldn't you rather multiply your throughput by 10x? Use pg.connect and set pg.defaults.poolSize to something sane (we do 25-100, not sure of the right number yet).
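The throughput claim above can be shown with a back-of-the-envelope model. Assume (purely for illustration) that every query occupies its connection for one fixed time unit; then total wall time for a burst of queries is just the queue depth divided by the number of connections:

```javascript
// Toy model of connection-level serialization: each query holds its
// connection for one time unit. A single connection processes n queries
// in n units; a pool of size k runs up to k at once, so the burst
// finishes in ceil(n / k) units.
function totalTimeUnits(queryCount, poolSize) {
  return Math.ceil(queryCount / poolSize);
}

console.log(totalTimeUnits(100, 1));  // 100 units: one global client serializes everything
console.log(totalTimeUnits(100, 25)); // 4 units: a pool of 25 cuts the wait 25x
```

Real queries vary in duration, so this overstates the regularity, but the shape of the win is the same: pool size multiplies how many in-flight queries the server sees.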

new pg.Client is for when you know what you're doing: when you need a single long-lived client for some reason, or need to very carefully control its life-cycle. A good example of this is using LISTEN/NOTIFY. The listening client needs to stay around, connected, and unshared so it can properly handle NOTIFY messages. Another example would be opening up a one-off client to kill some hung stuff, or in command-line scripts.

@brianc
brianc / node-postgres-debug.js
Created November 19, 2010 17:47
debugging node postgres
var Client = require(__dirname + '/node-postgres/lib').Client;
var client = new Client({
  host: '1.0.1.1',
  port: 6543,
  user: 'Kev',
  database: 'dev',
  password: '...'
});
@brianc
brianc / gist:5547726
Created May 9, 2013 14:19
a transaction with node-postgres - this is how _I_ write code which uses node-postgres. YMMV.
//this is an example of how you might run a transaction
//in postgres with node. This example has as little
//abstraction as possible. In your app you'll
//likely want to build a layer on top to ease some of the
//callback management and reduce code duplication
var log = require('logged')(__filename);
var pg = require('pg');
@brianc
brianc / gist:1663864
Created January 23, 2012 15:40
monkey-patch node-postgres to log queries
var Query = require('pg').Query;
var actualSubmit = Query.prototype.submit;
Query.prototype.submit = function() {
  console.log(this.text);
  actualSubmit.apply(this, arguments);
};
git push -u origin $(git symbolic-ref --short HEAD)
#!/bin/bash
branch=$(git branch | sed -n -e 's/^\* \(.*\)/\1/p')
git fetch
git rebase origin/master
git rebase -i HEAD~`git log origin/master.. --pretty=oneline | wc -l | sed -e 's/^[ \t]*//'`
git push origin $branch --force
git checkout master
pg.connect(function (err, client, done) {
  client.query('set role foo')
  client.query('select * from secret', function (err, rows) {
    done()
  })
})
@brianc
brianc / gist:d825fd26c672781e4534
Last active June 29, 2016 00:44
How we load our elasticsearch indices
//stream a postgres table directly into an elastic search index
//you need to have your index created and your type mappings defined in elastic search
//and then you just do this:
//bash: npm i pg-readable pg.js clumpy stretchypants
var es = require('stretchypants').index('your_index').type('your_type_mapping')
var clumpy = require('clumpy')
var query = require('pg-readable')