@th0j
Created September 14, 2018
Import a large SQL dump file into a MySQL database from the command line
#!/bin/sh
# Record the start time.
start_time=$(date)
echo "Import started at: $start_time"
dumpfile="/home/bob/bobiras.sql"
# Build the statements that wrap the import: a larger network buffer
# and packet size speed up bulk loading, and disabling foreign-key
# checks, unique checks, and autocommit avoids per-row overhead
# while the dump is replayed.
ddl="SET NAMES utf8; "
ddl="$ddl SET GLOBAL net_buffer_length=1000000; "
ddl="$ddl SET GLOBAL max_allowed_packet=1000000000; "
ddl="$ddl SET foreign_key_checks = 0; "
ddl="$ddl SET unique_checks = 0; "
ddl="$ddl SET autocommit = 0; "
# If your dump file does not create a database, select one here.
ddl="$ddl USE jetdb; "
ddl="$ddl SOURCE $dumpfile; "
# Re-enable the checks and commit everything at once.
ddl="$ddl SET foreign_key_checks = 1; "
ddl="$ddl SET unique_checks = 1; "
ddl="$ddl SET autocommit = 1; "
ddl="$ddl COMMIT; "
echo "Running import..."
time mysql -h 127.0.0.1 -u root -proot -e "$ddl"
# Record the end time and report both timestamps.
end_time=$(date)
echo "Start import: $start_time"
echo "End import:   $end_time"
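To use the script, adjust dumpfile, the database name, and the MySQL credentials for your setup, then make it executable and run it. A minimal invocation sketch, assuming the script is saved as import_dump.sh (a hypothetical filename):

# Make the script executable, then run it.
chmod +x import_dump.sh
./import_dump.sh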
th0j commented Sep 14, 2018:
Note: importing a 5 GB SQL dump succeeded in about 40 minutes on a MacBook Pro 2015 (i7 2.2 GHz, 16 GB RAM). Previously, I tried on a Linux machine with 8 GB RAM but still ran out of memory.
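For dumps this large it also helps to see progress while the import runs. A minimal sketch of one common alternative, piping the file through the pv utility (assuming pv is installed; note this bypasses the session tweaks built into the script above, trading some speed for a progress bar):

# Stream the dump into mysql; pv shows throughput and percent complete.
pv /home/bob/bobiras.sql | mysql -h 127.0.0.1 -u root -proot jetdb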