I wanted to export data from a few Django apps, held in a postgres database, and import that data into an sqlite3 database to run locally.
This worked for me, but I'm sure there are other wrinkles, depending on your use of postgres-specific features, etc.
- Export the data from postgres:

      $ pg_dump --column-inserts --data-only --blobs --table=YOUR_TABLE_NAME YOUR_DATABASE_NAME > my_pg_dump.sql

  Specify each table you want to export with its own `--table` option, e.g. `--table=my_table_1 --table=my_table_2`. My Django apps had labels that began with `spectator_`, so I used a wildcard pattern to include all of those tables: `--table="spectator_*"`
- Replace all mentions of the postgres schema. The postgres INSERTs all mention the `public` schema, so I searched for `INSERT INTO public.` and replaced it with `INSERT INTO ` (note the trailing space).
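One way to do that search and replace in one go, assuming GNU sed (on macOS/BSD sed the in-place flag is `-i ''`). The sample line here is hypothetical, standing in for the real dump:

```shell
# Hypothetical one-line sample standing in for the real dump file.
echo "INSERT INTO public.spectator_event VALUES (1, 'Gig');" > my_pg_dump.sql

# Strip the schema prefix in place (GNU sed; BSD/macOS needs: sed -i '' ...).
sed -i 's/INSERT INTO public\./INSERT INTO /' my_pg_dump.sql

cat my_pg_dump.sql   # INSERT INTO spectator_event VALUES (1, 'Gig');
```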
- Remove or comment out any lines starting with `SELECT pg_catalog`.
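If there are a lot of those lines, a sed delete saves the manual editing (again assuming GNU sed; the two sample lines are hypothetical):

```shell
# Two hypothetical dump lines: a sequence-reset line and a data line.
printf "SELECT pg_catalog.setval('spectator_event_id_seq', 5, true);\nINSERT INTO spectator_event VALUES (1);\n" > my_pg_dump.sql

# Delete every line that starts with SELECT pg_catalog.
sed -i '/^SELECT pg_catalog/d' my_pg_dump.sql

cat my_pg_dump.sql   # only the INSERT line remains
```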
- Replace all boolean `true` or `false` values with `1` and `0`. How you do this depends on your data. You could just search and replace, so long as "true" and "false" aren't used in any other context, such as in strings. You may need to work out a sed command that does this well enough for you.
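A blunt word-boundary version of that replace (GNU sed), on a hypothetical sample line. Note how it demonstrates the caveat above: it also rewrites `true` inside a string literal, so check your data before trusting it:

```shell
# Hypothetical line with booleans both as values and inside a string.
echo "INSERT INTO spectator_event VALUES (1, true, false, 'a true story');" > my_pg_dump.sql

# Word-boundary replace (GNU sed). This also rewrites true/false inside
# string literals, as the last column shows -- too blunt for some data.
sed -i 's/\btrue\b/1/g; s/\bfalse\b/0/g' my_pg_dump.sql

cat my_pg_dump.sql   # INSERT INTO spectator_event VALUES (1, 1, 0, 'a 1 story');
```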
- Delete or rename any existing sqlite database.
- Create an empty Django database:

      $ ./manage.py migrate
- Then enter sqlite3 and read the data in (assuming your project expects the database name to be `db.sqlite3`, which is the Django default):

      $ sqlite3 db.sqlite3
      sqlite> .read my_pg_dump.sql
      sqlite> .exit
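You can also skip the interactive prompt and pipe the dump straight in. A sketch with a tiny stand-in dump file (in practice this would be your cleaned-up dump, and the table would already exist from the migrate step):

```shell
# Stand-in dump; normally this is the cleaned-up output of the steps above.
printf 'CREATE TABLE spectator_event (id INTEGER);\nINSERT INTO spectator_event VALUES (1);\n' > my_pg_dump.sql

# Read the whole file without opening the interactive shell.
sqlite3 db.sqlite3 < my_pg_dump.sql

sqlite3 db.sqlite3 'SELECT COUNT(*) FROM spectator_event;'   # prints 1
```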
- If all goes well you might then just need to create the superuser:

      $ ./manage.py createsuperuser