@fantix
Created September 16, 2020 06:22
Why you shouldn't use SQLAlchemy 1.3 with FastAPI
"""
This code demonstrates how FastAPI server freezes with SQLAlchemy under concurrency,
if you're mixing blocking operations with async calls inproperly.
(pip install sqlalchemy httpx fastapi uvicorn)
You will see 15 (not 20!) "Handling xxx", then the server freezes for minutes,
refusing to serve any request.
This is caused by "resource starvation" - the default SQLAlchemy connection pool size
is 10 + 5 max_overflow = 15. So when the first 15 requests are awaiting for the
asyncio.sleep(0.1) (this can be any async operations), the 16-20th requests will be
blocked at trying to acquire a new database connection. As SQLAlchemy 1.3 is not async,
this will freeze the whole server, including the first 15 requests.
In short, the main thread must never be blocked, ever, even if the blocking operation
doesn't seem to be I/O-bound.
"""
import asyncio
import os
import fastapi
import httpx
import uvicorn
from fastapi import Depends
from sqlalchemy import create_engine, select, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker, Session
SQLALCHEMY_DATABASE_URL = os.environ.get("SQLALCHEMY_DATABASE_URL", "postgresql:///")
engine = create_engine(SQLALCHEMY_DATABASE_URL)
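# Note added for illustration (not in the original gist): the 15-connection limit
# described in the docstring comes from SQLAlchemy's QueuePool defaults. Spelling
# them out would look like the call below; raising the limits only moves the freeze
# threshold, it does not remove the underlying blocking problem.
# engine = create_engine(
#     SQLALCHEMY_DATABASE_URL,
#     pool_size=5,       # persistent connections kept in the pool (default)
#     max_overflow=10,   # extra connections allowed beyond pool_size (default)
#     pool_timeout=30,   # seconds a checkout blocks before raising TimeoutError (default)
# )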
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()


class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True, index=True)
    name = Column(String)


app = fastapi.FastAPI()
@app.get("/")
async def list_names(i: int, db: Session = Depends(get_db)):
    users = db.execute(select([User])).fetchall()
    print("Handling %s" % i)
    await asyncio.sleep(0.1)  # <----- simulating async work
    return i, [user.name for user in users]
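# Sketch added for illustration (not part of the original demo): one possible way to
# keep the event loop responsive with SQLAlchemy 1.3 is to push the blocking call
# into a worker thread. The "/nonblocking" route and the run_in_threadpool usage are
# an assumed mitigation, not the author's recommended fix.
from fastapi.concurrency import run_in_threadpool


@app.get("/nonblocking")
async def list_names_nonblocking(i: int, db: Session = Depends(get_db)):
    # The blocking execute() (including any wait on the connection pool) runs in a
    # worker thread, so the event loop keeps driving the other requests' coroutines.
    users = await run_in_threadpool(lambda: db.execute(select([User])).fetchall())
    print("Handling %s (threadpool variant)" % i)
    await asyncio.sleep(0.1)
    return i, [user.name for user in users]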
@app.on_event("startup")
def startup():
    Base.metadata.create_all(engine)
async def request(i):
    async with httpx.AsyncClient() as client:
        resp = await client.get("http://127.0.0.1:8000/?i=%s" % i)
        print(resp.json())


async def concurrent_request():
    await asyncio.sleep(1)
    fs = [request(i) for i in range(20)]
    await asyncio.wait(fs)
asyncio.ensure_future(concurrent_request())
uvicorn.run(app, host="127.0.0.1", port=8000)