Created March 31, 2018 05:19
import asyncio

import psycopg2.pool
import uvloop  # Sanic uses uvloop for its event loop when it is installed
from asyncpg import create_pool
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)
DSN = ''
# Assumed query; the original constant is not shown in the gist.
TEST_SQL_QUERY = 'SELECT * FROM data'


def jsonify(records):
    # asyncpg returns Record objects; items() yields (column, value) pairs
    return [dict(r.items()) for r in records]


@app.listener('before_server_start')
async def register_db(app, loop):
    app.pool_1 = await create_pool(
        DSN, loop=loop, min_size=10, max_size=10)
    app.pool_2 = psycopg2.pool.PersistentConnectionPool(
        dsn=DSN, minconn=10, maxconn=10)
    async with app.pool_1.acquire() as connection:
        await connection.execute('DROP TABLE IF EXISTS data')
        await connection.execute("""CREATE TABLE data (
            id serial primary key,
            content varchar(50),
            post_date timestamp)""")
        for i in range(0, 1000):
            await connection.execute(f"""INSERT INTO data
                (id, content, post_date) VALUES ({i}, '{i}', now())""")


@app.route('/asyncpg/select')
async def root_get(request):
    async with app.pool_1.acquire() as connection:
        results = await connection.fetch(TEST_SQL_QUERY)
    payload = {'posts': jsonify(results)}
    return json(payload)


@app.route('/psycopg2/select')
async def psycopg2_select(request):
    conn = app.pool_2.getconn()
    conn.autocommit = True
    try:
        with conn.cursor() as curs:
            curs.execute(TEST_SQL_QUERY)
            columns = [col[0] for col in curs.description]
            results = curs.fetchall()
    finally:
        app.pool_2.putconn(conn)
    payload = {'posts': [dict(zip(columns, row)) for row in results]}
    return json(payload)


if __name__ == '__main__':
    app.run(host='', port=8000, debug=False, access_log=False)
psiofxt commented Mar 31, 2018

200 concurrent requests against the asyncpg pool complete without a problem. The psycopg2 pool fails every time, both because of Postgres connection limits and because its blocking calls run directly on the event loop instead of being awaited in a coroutine.

Server Software:
Server Hostname:        localhost
Server Port:            8000

Document Path:          /asyncpg/select
Document Length:        49791 bytes

Concurrency Level:      200
Time taken for tests:   3.777 seconds
Complete requests:      1000
Failed requests:        0
Total transferred:      49884000 bytes
HTML transferred:       49791000 bytes
Requests per second:    264.74 [#/sec] (mean)
Time per request:       755.456 [ms] (mean)
Time per request:       3.777 [ms] (mean, across all concurrent requests)
Transfer rate:          12896.81 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   2.2      0       8
Processing:    48  686 340.2    746    2112
Waiting:       44  681 339.5    740    2110
Total:         49  687 340.2    746    2112

Percentage of the requests served within a certain time (ms)
  50%    746
  66%    753
  75%    757
  80%    759
  90%   1111
  95%   1395
  98%   1439
  99%   1468
 100%   2112 (longest request)
