@pad92
Last active December 5, 2023 12:46
MySQL Import (MultiThread)
#!/bin/sh
# Run this from the backup directory.
# Expected dump file naming: database-table.sql.gz
for FILE in *.sql.gz ; do
    DATABASE=$(echo "$FILE" | cut -d- -f1)
    TABLE=$(echo "$FILE" | cut -d- -f2 | cut -d. -f1)
    # Throttle: wait while the server already has more than 100 threads
    PS=$(mysql -e 'show processlist\G' | grep -c '^Command')
    while [ "$PS" -gt 100 ]; do
        sleep 5
        PS=$(mysql -e 'show processlist\G' | grep -c '^Command')
    done
    mysql -e "TRUNCATE TABLE $DATABASE.$TABLE"
    zcat "$FILE" | mysql "$DATABASE" &
done
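The database and table names are recovered purely from the filename. A minimal sketch of the parsing, using a hypothetical dump named `shop-orders.sql.gz` (the names are made up for illustration):

```shell
# Hypothetical dump file, following the database-table.sql.gz convention
FILE="shop-orders.sql.gz"
DATABASE=$(echo "$FILE" | cut -d- -f1)             # part before the first "-"
TABLE=$(echo "$FILE" | cut -d- -f2 | cut -d. -f1)  # part between "-" and the first "."
echo "$DATABASE.$TABLE"   # prints "shop.orders"
```

Note this breaks if the database or table name itself contains a `-`, since `cut` splits on the first occurrence.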
@killrazor

Why 100 processes? What is the limiting factor or how did you come to that number?

@pad92
Author

pad92 commented Dec 4, 2019

> Why 100 processes? What is the limiting factor or how did you come to that number?

To avoid overloading the server.
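The cap of 100 is an arbitrary safety margin, not a MySQL constant. Whatever threshold you pick, the polling loop behaves like this sketch, where a stub counter stands in for the real `mysql -e 'show processlist\G' | grep -c '^Command'` call (assumption: against a live server you would query MySQL instead):

```shell
# Stub thread counter standing in for the processlist query
LOAD=103
count_threads() { echo "$LOAD"; }

PS=$(count_threads)
while [ "$PS" -gt 100 ]; do
    # would be `sleep 5` against a real server; the stub just drains one per poll
    LOAD=$((LOAD - 1))
    PS=$(count_threads)
done
echo "$PS"   # the loop exits once the count is back at or below the threshold
```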

@Logicbloke

This is really one table per process rather than multiple threads importing the same table.

@leroy0211

leroy0211 commented Dec 5, 2023

It's an old gist, but if you add `wait` at the end of the script, it won't exit until all tables have been imported. Without the `wait`, background imports may still be running after the script returns, without your knowledge.
Also, if you run this over an SSH session, the background processes will be killed when you disconnect.
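The effect of that trailing `wait` can be illustrated with plain background jobs, where `sleep` stands in for each backgrounded `zcat | mysql` import:

```shell
tmp=$(mktemp)
for i in 1 2 3; do
    # each subshell plays the role of one backgrounded import
    ( sleep 1; echo "$i" >> "$tmp" ) &
done
wait   # block until every background job has exited
DONE=$(wc -l < "$tmp" | tr -d ' ')
echo "$DONE imports finished"
rm -f "$tmp"
```

Without the `wait`, the script would reach the final `echo` immediately, while the three jobs were still running.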

@pad92
Author

pad92 commented Dec 5, 2023

> It's an old gist, but if you add wait at the end of the command, the command does not finish until all tables have been imported. Without the wait there could still be running background processes, which you don't know about. Also if you run this over an ssh terminal, the subprocesses will be stopped when you disconnect your ssh session.

yes, you can :)
