@tiagosampaio
Last active May 22, 2024 20:05

MySQL Operations Cheat Sheet

Backing up all databases:

mysqldump -u user -p --all-databases > dump.sql

Backing up a single database:

mysqldump -u user -p --databases db1 > dump.sql

Backing up multiple databases:

mysqldump -u user -p --databases db1 db2 db... > dump.sql

Backing up with triggers included:

mysqldump -u user -p --triggers --all-databases > dump.sql

Backing up with stored procedures and functions included:

mysqldump -u user -p --routines --all-databases > dump.sql

Exporting only data from a single table:

mysqldump -u user -p --no-create-info [db_name] [table_name] > table.data.sql

Exporting only structure from a single table:

mysqldump -u user -p --no-data [db_name] [table_name] > table.structure.sql

Compressing the SQL dump on the fly

gzip

Exporting a specific database

mysqldump -u user -p database | gzip > dump.sql.gz

Exporting all databases

mysqldump -u user -p --all-databases | gzip > dump.sql.gz

bzip2

We can reach an even higher compression ratio using bzip2:

mysqldump -u user -p --all-databases | bzip2 > dump.sql.bz2

Comparing the same dump with the three compression options (none, gzip, and bzip2) gave the following results:

  • Dump no compression – 947k
  • Dump with gzip – 297k
  • Dump with bzip2 – 205k
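The exact numbers depend on the data, but you can reproduce the comparison locally. A minimal sketch, using generated text as a stand-in for a real dump file (assumes gzip and bzip2 are installed):

```shell
# Generate ~1.3 MB of repetitive text as a stand-in for an actual dump.sql.
seq 1 200000 > dump.sql

# Compress with both tools, keeping the original for comparison.
gzip  -c dump.sql > dump.sql.gz
bzip2 -c dump.sql > dump.sql.bz2

# Print the resulting sizes in bytes; gzip and bzip2 should both be
# dramatically smaller than the uncompressed file.
wc -c dump.sql dump.sql.gz dump.sql.bz2
```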

And how do I restore the dump?

Normal:

mysql -u user -p < dump.sql

gzip

gunzip < dump.sql.gz | mysql -u user -p

bzip2

bunzip2 < dump.sql.bz2 | mysql -u user -p

zcat

zcat dump.sql.gz | mysql -u user -p

More details about mysqldump can be found in the MySQL online documentation.

Advanced Operations

First Step

First, we export the database structure only (no data, no triggers), stripping DEFINER clauses with sed:

mysqldump -h [host] -u [user] -p [database] --no-tablespaces --skip-triggers --no-data | sed -e 's/DEFINER[ ]*=[ ]*[^*]*\*/\*/' > dump.sql

Second Step

Then it's time to export the data. You can pass --ignore-table=[database].[table] as many times as you want; ignored tables keep the structure created in the first step but remain empty.

mysqldump -h [host] -u [user] -p [database] --no-tablespaces --skip-triggers --no-create-info --ignore-table=[database].admin_user_session --ignore-table=[database].mageplaza_smtp_log | sed -e 's/DEFINER[ ]*=[ ]*[^*]*\*/\*/' >> dump.sql

Third Step

Then it's time to export the routines. Here, neither data nor table structure is exported.

mysqldump -h [host] -u [user] -p [database] --no-tablespaces --routines --no-create-info --no-data --no-create-db --skip-opt | sed -e 's/DEFINER[ ]*=[ ]*[^*]*\*/\*/' >> dump.sql

Fourth Step

And finally, we compress the file to speed up the download.

gzip dump.sql

Change Character Set from latin1 to utf8mb4

Run the following command to determine the current character set of your database. Replace database_name below with your database name:

mysql> SELECT default_character_set_name 
       FROM information_schema.SCHEMATA S 
       WHERE schema_name = "database_name";
+----------------------------+
| default_character_set_name |
+----------------------------+
| latin1                     |
+----------------------------+

Run the following command to change the character set of a MySQL database from latin1 to utf8mb4. Replace database_name with your database name:

mysql> ALTER DATABASE database_name CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
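Note that ALTER DATABASE only changes the default character set used for tables created afterwards; existing tables keep their old character set. A sketch (using the same database_name placeholder) that generates one conversion statement per existing table, which you can then review and execute:

```
mysql> SELECT CONCAT('ALTER TABLE `', table_name,
              '` CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;')
       FROM information_schema.TABLES
       WHERE table_schema = 'database_name';
```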

Troubleshooting Possible Errors

You may occasionally run into errors when performing these operations.

Skipping Column Statistics

mysqldump: Couldn't execute 'SELECT COLUMN_NAME, JSON_EXTRACT(HISTOGRAM, '$."number-of-buckets-specified"') FROM information_schema.COLUMN_STATISTICS WHERE SCHEMA_NAME = 'magento' AND TABLE_NAME = 'core_config_data';': Unknown table 'COLUMN_STATISTICS' in information_schema (1109)

Solution:

This error typically occurs when a MySQL 8 client dumps from an older server that has no information_schema.COLUMN_STATISTICS table. Use the --skip-column-statistics option to suppress it.

Example:

mysqldump -h 127.0.0.1 -u root -p --no-create-info --skip-column-statistics magento core_config_data > core_config_data.data.sql

Remove DEFINER clause from MySQL Dumps

There are times when you'll run into the following error:

1449 - The user specified as a definer ('web2vi'@'%') does not exist

This happens because whenever you create a dump from a database whose objects have a DEFINER set, those DEFINER clauses are exported into the dump; you must either remove them or create the specified user and grant it permissions on your database server. The easiest way is to remove them from the dump with the following command:

On Linux

sed -i 's/DEFINER=[^*]*\*/\*/g' mydump.sql

On macOS

sed -i '' 's/DEFINER=[^*]*\*/\*/g' mydump.sql
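To sanity-check the expression, you can run it against a sample line of the kind mysqldump emits for views and triggers (the username here is illustrative):

```shell
# A typical DEFINER clause as it appears inside a dump's version comments.
# The pattern consumes everything from DEFINER= up to the closing '*',
# so the SQL SECURITY DEFINER text is removed as well.
printf '%s\n' '/*!50013 DEFINER=`web2vi`@`%` SQL SECURITY DEFINER */' |
  sed 's/DEFINER=[^*]*\*/\*/g'
# prints: /*!50013 */
```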

When creating the dump, you can instead use mysqlpump with the --skip-definer option to produce a dump without DEFINER clauses:

mysqlpump -u user --password='password' -h localhost --skip-definer database > database-no-definers.sql

Read more about mysqlpump here: https://dev.mysql.com/doc/refman/5.7/en/mysqlpump.html#option_mysqlpump_skip-definer

More references here: http://oksoft.blogspot.com/2008/10/mysqldump-backups-and-transfers.html

Show the Size of a MySQL Databases

SELECT table_schema "DB Name", ROUND(SUM(data_length + index_length) / 1024 / 1024, 1) "DB Size (MB)" 
FROM information_schema.tables 
GROUP BY table_schema;

The result will be something like this:

DB Name              DB Size (MB)
mysql                        11.8
performance_schema            0.0
my_database               26071.4

Show the Size of MySQL Tables

Size of a specific table:

SELECT
	table_name AS `Table`,
	round(((data_length + index_length) / 1024 / 1024), 2) `Size in MB`
FROM
	information_schema.TABLES
WHERE
	table_schema = 'db_name'
	AND table_name = 'table_name';

Size of all tables, descending order:

SELECT
	table_schema AS `Database`,
	table_name AS `Table`,
	round(((data_length + index_length) / 1024 / 1024), 2) `Size in MB`
FROM
	information_schema.TABLES
ORDER BY
	(data_length + index_length)
	DESC;