## MySQL Dump Large Table Entries In Chunks
## This process can be used to export a huge DB table. I've used it to export a 24 GB table
## containing 146 million records into chunks of 25 million records, with each chunk taking up
## approximately 500 MB compressed.
##
## Of course, USERNAME, DATABASENAME and TABLENAME need to be replaced with actual values.
## When run, each command will first prompt for that user's password.
##
## COMMAND BREAKDOWN:
## mysqldump -u [username] -p [databasename] [tablename]
## --no-create-info # dump data only; don't include CREATE TABLE statements in the dump
## --where="" # filter the dumped rows
## pv --progress --size 5000m # show a progress bar (pv measures the uncompressed stream before gzip, so assume ~5 GB per chunk and show progress toward that)
## gzip > filename # compress the dump (reduces each chunk from ~6 GB to ~0.6 GB)
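## Before picking chunk boundaries, it can help to check the table's id range and row
## count first -- a minimal sketch, assuming an auto-increment integer `id` column:
## mysql -u USERNAME -p -e "SELECT MIN(id), MAX(id), COUNT(*) FROM TABLENAME" DATABASENAME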
## dump the first 25 million records of the table
mysqldump -u USERNAME -p DATABASENAME TABLENAME --no-create-info --where="id<25000000" | pv --progress --size 5000m | gzip > backup_0_to_25_mil.sql.gz
## dump the second 25 million records of the table
mysqldump -u USERNAME -p DATABASENAME TABLENAME --no-create-info --where="id>=25000000 AND id<50000000" | pv --progress --size 5000m | gzip > backup_25_to_50_mil.sql.gz
## dump the third 25 million records of the table
mysqldump -u USERNAME -p DATABASENAME TABLENAME --no-create-info --where="id>=50000000 AND id<75000000" | pv --progress --size 5000m | gzip > backup_50_to_75_mil.sql.gz
## dump the fourth 25 million records of the table
mysqldump -u USERNAME -p DATABASENAME TABLENAME --no-create-info --where="id>=75000000 AND id<100000000" | pv --progress --size 5000m | gzip > backup_75_to_100_mil.sql.gz
## dump the fifth 25 million records of the table
mysqldump -u USERNAME -p DATABASENAME TABLENAME --no-create-info --where="id>=100000000 AND id<125000000" | pv --progress --size 5000m | gzip > backup_100_to_125_mil.sql.gz
## dump the sixth 25 million records of the table
mysqldump -u USERNAME -p DATABASENAME TABLENAME --no-create-info --where="id>=125000000 AND id<150000000" | pv --progress --size 5000m | gzip > backup_125_to_150_mil.sql.gz
## dump the seventh 25 million records of the table
mysqldump -u USERNAME -p DATABASENAME TABLENAME --no-create-info --where="id>=150000000 AND id<175000000" | pv --progress --size 5000m | gzip > backup_150_to_175_mil.sql.gz
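## The seven dumps above all follow one pattern, so they can also be generated in a
## loop -- a minimal sketch, assuming the same 25-million-id chunks and non-negative
## ids; note that with -p, mysqldump will prompt for the password on every iteration:
for i in $(seq 0 6); do
  from=$((i * 25000000))
  to=$(((i + 1) * 25000000))
  mysqldump -u USERNAME -p DATABASENAME TABLENAME --no-create-info \
    --where="id>=${from} AND id<${to}" \
    | pv --progress --size 5000m \
    | gzip > "backup_$((from / 1000000))_to_$((to / 1000000))_mil.sql.gz"
done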
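## To restore, stream each archive back through mysql -- a minimal sketch; since the
## dumps carry no CREATE TABLE statements, the target table must already exist:
for f in backup_*_mil.sql.gz; do
  gunzip -c "$f" | mysql -u USERNAME -p DATABASENAME
done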