@boscacci
Created November 8, 2019 22:57
Python Pandas - Write Very Large CSV file to (postgresql) SQL DB
import pandas as pd
from sqlalchemy import create_engine

# Your connection string format may vary by SQL flavor;
# fill in your own credentials, host, and database name:
con = f'postgresql://{username}:{password}@{host}:5432/{database}'
eng = create_engine(con)

# Open the CSV file as a stream and write it to SQL in chunks,
# appending as you go so the whole file never sits in memory:
for chunk in pd.read_csv('filename.csv', chunksize=1000):
    chunk.to_sql(name='giant_table',
                 con=eng,
                 if_exists='append')
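As a self-contained illustration of the same chunked-append pattern, here is a sketch that swaps Postgres for an in-memory SQLite database so it runs without a server; the CSV contents, table name, and chunk size are all hypothetical stand-ins:

```python
import io

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical CSV data standing in for 'filename.csv' (10 rows):
csv_data = io.StringIO(
    "id,value\n" + "\n".join(f"{i},{i * 2}" for i in range(10))
)

# In-memory SQLite keeps the example self-contained:
eng = create_engine("sqlite://")

# Stream the CSV in small chunks, appending each to the table:
for chunk in pd.read_csv(csv_data, chunksize=4):
    chunk.to_sql(name="giant_table", con=eng,
                 if_exists="append", index=False)

# Verify every row arrived:
total = pd.read_sql("SELECT COUNT(*) AS n FROM giant_table", eng)["n"][0]
print(total)  # 10
```

Note `index=False`: without it, `to_sql` also writes the DataFrame index as a column, which is usually unwanted when the CSV already carries its own key.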