
@jasny
Last active February 15, 2024 16:13
Split MySQL dump SQL file into one file per table or extract a single table
#!/bin/bash
####
# Split MySQL dump SQL file into one file per table
# based on http://blog.tty.nl/2011/12/28/splitting-a-database-dump
####

if [ $# -lt 1 ] ; then
    echo "USAGE: $0 DUMP_FILE [TABLE]"
    exit 1
fi

if [ $# -ge 2 ] ; then
    # Single-table mode: chunk 0 = dump header; skip ahead (no file) to the
    # requested table's header; chunk 1 = that table; skip (no file) to the
    # end-of-dump marker; the final chunk is the footer.
    csplit -s -ftable "$1" "/-- Table structure for table/" "%-- Table structure for table \`$2\`%" "/-- Table structure for table/" "%40103 SET TIME_ZONE=@OLD_TIME_ZONE%1"
else
    # Split at every "Table structure for table" header.
    csplit -s -ftable "$1" "/-- Table structure for table/" {*}
fi

[ $? -eq 0 ] || exit 1

# The first chunk holds the dump header (the leading SET statements).
mv table00 head

FILE=$(ls -1 table* | tail -n 1)
if [ $# -ge 2 ] ; then
    # Single-table mode: the last chunk is already the dump footer.
    mv $FILE foot
else
    # Split the dump footer off the last table's chunk.
    csplit -b '%d' -s -f$FILE $FILE "/40103 SET TIME_ZONE=@OLD_TIME_ZONE/" {*}
    mv ${FILE}1 foot
fi

# Wrap each chunk in the dump header and footer; the table name sits
# between backticks on the chunk's first line.
for FILE in $(ls -1 table*); do
    NAME=$(head -n1 $FILE | cut -d$'\x60' -f2)
    cat head $FILE foot > "$NAME.sql"
done

rm head foot table*
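
Typical usage, assuming the script is saved as mysql_splitdump.sh (the filename is illustrative). Note that it writes its head, foot and table* work files into the current directory, so run it somewhere scratch:

# Split every table in dump.sql into its own <table>.sql file:
bash mysql_splitdump.sh dump.sql

# Extract only one table (`users` here is a placeholder name):
bash mysql_splitdump.sh dump.sql users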
@adam-jones-net

Any tool that could export all tables from a specific database inside a multi-database dump would be absolutely amazing. If anyone knows of such a tool, please share ;)
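
One rough approach, sketched under the assumption that the dump was made with mysqldump --databases (which prefixes each database's section with a -- Current Database: `name` comment): carve the wanted database out with sed and feed the result to this script. mydb and full_dump.sql are placeholder names, and the extracted file will lack the original dump's global SET header, so check the output before relying on it:

# Print from mydb's section header up to the next database header (or EOF):
sed -n '/^-- Current Database: `mydb`/,/^-- Current Database: /p' full_dump.sql > mydb.sql
# Then split the single-database file as usual:
bash mysql_splitdump.sh mydb.sql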

@exabrial

This script doesn't work if there are more than 100 tables: tail -n 1 doesn't pick the last chunk, because ls doesn't sort the files numerically (table101 sorts before table29). To fix it, add -n4 to csplit so the suffix is zero-padded to 4 digits, and change mv table00 head to mv table0000 head.

which csplit command? there are several :(

@m-faraz

m-faraz commented Feb 2, 2024

There is a simple fix for those reporting the issue with more than 100 files:

  1. Add -n5 to both csplit commands to allow up to 5 zero-padded digits in the chunk names.
  2. Change mv table00 head to mv table00000 head.

This now handles up to 100,000 tables, numbered 00000 through 99999.
Upgrade as required!
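
Concretely, the patched lines would look something like this (a sketch of the fix above, shown for the all-tables branch; only the suffix width and the head rename change):

# All-tables split, with 5-digit zero-padded suffixes:
csplit -s -ftable -n5 "$1" "/-- Table structure for table/" {*}
# The first chunk is now named table00000:
mv table00000 head

With every chunk name at the same width, ls sorts them numerically and tail -n 1 really does pick the last chunk.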

Bonus: add rm $FILE after the cat head $FILE foot line to remove each chunk as it is processed, saving space when there are many files. My scenario had over 2500 tables, with the base SQL file at 130GB!
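
That is, the wrap-up loop becomes (a sketch):

for FILE in $(ls -1 table*); do
    NAME=$(head -n1 $FILE | cut -d$'\x60' -f2)
    cat head $FILE foot > "$NAME.sql"
    rm $FILE   # drop each chunk once it has been wrapped
done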

Bonus, 2 hours later: for the really lazy ones, here's a link to my version of this script, including output and timing:
https://gist.github.com/m-faraz/4e32cbae283e3a9454118cd1d281f8cf
