@Bondifrench
Last active September 25, 2017 16:48
Importing Excel table into Postgres
-- Importing an Excel table into Postgres
-- 1) Save the Excel file in .csv format with a single header row,
--    e.g. example_file.csv
-- 2) Put the file in C:/PostgreSQL/9.3/data (it must live inside "data"
--    so Postgres has the required file permissions)
-- 3) Create an empty table whose schema can hold the data
-- 4) ex:
CREATE TABLE example_table (
    gid numeric,
    province character varying(10),
    "name" character varying(68),
    pop1996 numeric
);
-- 5) Now we can import the csv file into the empty table with the following query:
COPY example_table (gid, province, "name", pop1996)
FROM 'csv/example_file.csv'
WITH DELIMITER ','
CSV HEADER;
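If you would rather not move the file into the Postgres data directory, psql's client-side `\copy` meta-command reads the file with the client's permissions instead of the server's. A sketch, assuming the same table and a hypothetical client-side path:

```sql
-- Run inside psql; \copy is a psql meta-command, not server-side SQL.
-- 'C:/temp/example_file.csv' is a hypothetical path on the client machine.
\copy example_table (gid, province, "name", pop1996) FROM 'C:/temp/example_file.csv' WITH DELIMITER ',' CSV HEADER
```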
-- 6) Don't forget to add a primary key to the new table
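Step 6 can be done with `ALTER TABLE`. A minimal sketch, assuming `gid` uniquely identifies each row:

```sql
-- Assumes gid holds unique, non-null values for every row
ALTER TABLE example_table ADD PRIMARY KEY (gid);
```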
@Bondifrench

When bulk loading, dropping and recreating indexes is far faster than leaving them in place and forcing thousands or millions of individual index updates.

Run VACUUM ANALYZE before and after bulk loading data to be safe. If you dropped indexes, be sure to recreate them before running ANALYZE.
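The drop-load-recreate sequence can be sketched as follows; the index name and column are hypothetical, so adapt them to your schema:

```sql
-- Hypothetical index; replace with the indexes your table actually has
DROP INDEX IF EXISTS example_table_province_idx;
VACUUM ANALYZE example_table;

-- Bulk load with no indexes in the way
COPY example_table (gid, province, "name", pop1996)
FROM 'csv/example_file.csv'
WITH DELIMITER ','
CSV HEADER;

-- Recreate the index, then ANALYZE so the planner sees fresh statistics
CREATE INDEX example_table_province_idx ON example_table (province);
VACUUM ANALYZE example_table;
```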

See http://blog.gtuhl.com/2009/04/18/bulk-data-loading-with-postgresql/
