@esperlu
Created April 27, 2011 05:46
MySQL to SQLite converter
#!/bin/sh
# Converts a mysqldump file into a SQLite 3 compatible file. It also extracts the MySQL `KEY xxxxx` lines from the
# CREATE block and creates them in separate commands _after_ all the INSERTs.
# Awk is chosen because it's fast and portable. You can use gawk, the original awk or even the lightning-fast mawk.
# The mysqldump file is traversed only once.
# Usage: $ ./mysql2sqlite mysqldump-opts db-name | sqlite3 database.sqlite
# Example: $ ./mysql2sqlite --no-data -u root -pMySecretPassWord myDbase | sqlite3 database.sqlite
# Thanks to @artemyk and @gkuenning for their nice tweaks.
mysqldump --compatible=ansi --skip-extended-insert --compact "$@" | \
awk '
BEGIN {
FS=",$"
print "PRAGMA synchronous = OFF;"
print "PRAGMA journal_mode = MEMORY;"
print "BEGIN TRANSACTION;"
}
# CREATE TRIGGER statements have funny commenting. Remember that we are inside a trigger.
/^\/\*.*CREATE.*TRIGGER/ {
gsub( /^.*TRIGGER/, "CREATE TRIGGER" )
print
inTrigger = 1
next
}
# The end of CREATE TRIGGER has a stray comment terminator
/END \*\/;;/ { gsub( /\*\//, "" ); print; inTrigger = 0; next }
# The rest of the trigger body is passed through as is.
inTrigger != 0 { print; next }
# Skip other comments
/^\/\*/ { next }
# Print all `INSERT` lines. Escaped single quotes are protected by doubling them.
/INSERT/ {
gsub( /\\\047/, "\047\047" )  # double up escaped single quotes (047 = octal for the quote char)
gsub(/\\n/, "\n")             # unescape newlines
gsub(/\\r/, "\r")             # unescape carriage returns
gsub(/\\"/, "\"")             # unescape double quotes
gsub(/\\\\/, "\\")            # unescape backslashes
gsub(/\\\032/, "\032")        # unescape Ctrl-Z (MySQL \Z)
print
next
}
# Print the `CREATE` line as is and capture the table name.
/^CREATE/ {
print
if ( match( $0, /\"[^\"]+/ ) ) tableName = substr( $0, RSTART+1, RLENGTH-1 )
}
# Replace `FULLTEXT KEY`, or any other `XXXXX KEY` except `PRIMARY KEY`, with plain `KEY`
/^  [^"]+KEY/ && !/^  PRIMARY KEY/ { gsub( /.+KEY/, "  KEY" ) }
# Get rid of field lengths in KEY lines
/ KEY/ { gsub(/\([0-9]+\)/, "") }
# Print all field definition lines except the `KEY` lines.
/^  / && !/^(  KEY|\);)/ {
gsub( /AUTO_INCREMENT|auto_increment/, "" )
gsub( /(CHARACTER SET|character set) [^ ]+ /, "" )
gsub( /DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP|default current_timestamp on update current_timestamp/, "" )
gsub( /(COLLATE|collate) [^ ]+ /, "" )
gsub(/(ENUM|enum)[^)]+\)/, "text ")
gsub(/(SET|set)\([^)]+\)/, "text ")
gsub(/UNSIGNED|unsigned/, "")
if (prev) print prev ","
prev = $1
}
# `KEY` lines are extracted from the `CREATE` block and stored in an array to be printed later
# as separate `CREATE INDEX` commands. The index name is prefixed with the table name to
# avoid an SQLite error on duplicate index names.
/^(  KEY|\);)/ {
if (prev) print prev
prev=""
if ($0 == ");"){
print
} else {
if ( match( $0, /\"[^"]+/ ) ) indexName = substr( $0, RSTART+1, RLENGTH-1 )
if ( match( $0, /\([^()]+/ ) ) indexKey = substr( $0, RSTART+1, RLENGTH-1 )
key[tableName]=key[tableName] "CREATE INDEX \"" tableName "_" indexName "\" ON \"" tableName "\" (" indexKey ");\n"
}
}
# Print all the stored `CREATE INDEX` commands.
END {
for (table in key) printf key[table]
print "END TRANSACTION;"
}
'
exit 0
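
A minimal end-to-end run might look like this (a sketch; the user, password and database names are placeholders, as in the header comments, and the script is assumed to be saved as mysql2sqlite):

chmod +x mysql2sqlite
./mysql2sqlite -u root -pMySecretPassWord myDbase | sqlite3 database.sqlite
# Quick sanity check: list the tables that were created.
sqlite3 database.sqlite '.tables'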
@kumarpatel

@ephraimumpan
https://github.com/dumblob/mysql2sqlite
Use this instead. Forked from this gist and improved upon.
Has instructions on usage as well.

@joonas-fi

For exporting SQL databases (mysql/postgres/sqlite) to JSON (after which you can write a script to import somewhere else or just process the data), see: https://github.com/function61/sql2json

@mauvm

mauvm commented Nov 10, 2017

The script doesn't handle a CONSTRAINT that follows a column definition when the KEY lines between them were removed. This causes SQLITE_ERROR: near "FOREIGN": syntax error. Example output:

CREATE TABLE "foo" (
  "bar" varchar(32) NOT NULL
  CONSTRAINT "..." FOREIGN KEY ...
);

Notice the missing comma after the bar column definition? To fix this, add the following (sketched below):

  • removedKeys = 0 after line 57
  • if (prev == "" && removedKeys > 0) print " ," after line 75 (was 74)
  • removedKeys += 1 after line 86 (was 84)
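
Here is a sketch of how those three additions can slot into the awk program (removedKeys is the counter named above; in this sketch it is reset once per CREATE block rather than at a fixed line number):

# Reset the counter whenever a new CREATE block starts.
/^CREATE/ { removedKeys = 0 }
# Field definition rule: if the lines just before this one were extracted
# KEYs, the previous definition was already printed without its comma.
/^  / && !/^(  KEY|\);)/ {
if (prev == "" && removedKeys > 0) print "  ,"
# ... existing gsub() cleanups and prev bookkeeping unchanged ...
}
# KEY extraction rule: count every key removed from the CREATE block.
/^(  KEY|\);)/ {
if (prev) print prev
prev = ""
if ($0 == ");") {
print
} else {
removedKeys += 1
# ... existing CREATE INDEX bookkeeping unchanged ...
}
}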

@StarveTheEgo

Can somebody help me use mysql2sqlite.sh to convert MySQL to SQLite 3? I really don't have any idea. A from-scratch tutorial on Windows would help.

ahaha, you spammed here the same way too, lmfao

@Satish-A-Wadekar

Satish-A-Wadekar commented Mar 23, 2019

I am facing this issue:

./mysql2sqlite.sh: line 14: mysqldump: command not found

Can anybody give a step-by-step walkthrough for executing this script, please? I have never run a bash script before.

@ianarman1

ianarman1 commented Jun 4, 2019

Hello,

I ran this script and received:

mysqldump: Error: 'Expression #6 of SELECT list is not in GROUP BY clause and contains nonaggregated column 'information_schema.FILES.EXTRA' which is not functionally dependent on columns in GROUP BY clause; this is incompatible with sql_mode=only_full_group_by' when trying to dump tablespaces memory

After running ls -l on the output file, I can confirm there is data in it.

@R46narok

R46narok commented Oct 4, 2020

Hello,

I ran this script and received:

mysqldump: Error: 'Expression #6 of SELECT list is not in GROUP BY clause and contains nonaggregated column 'information_schema.FILES.EXTRA' which is not functionally dependent on columns in GROUP BY clause; this is incompatible with sql_mode=only_full_group_by' when trying to dump tablespaces memory

After running ls -l on the output file, I can confirm there is data in it.

Any progress?
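
The error itself comes from the server: with sql_mode=only_full_group_by, MySQL rejects the query mysqldump uses to dump tablespace information from information_schema. A commonly reported workaround, assuming your mysqldump supports the option, is to skip tablespaces altogether by adding --no-tablespaces to the first line of the script:

mysqldump --no-tablespaces --compatible=ansi --skip-extended-insert --compact "$@" | \
awk '
[...]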

@florianschroen

If someone runs into the same problem...

I had a problem with "umlauts" (äüöß...) while exporting a latin1 MySQL database and converting it to SQLite.

In the mysqldump, all umlauts were displayed as ? or combinations like <fe>.

The cause was awk; it can be fixed by converting the MySQL dump from latin1 to UTF-8 before parsing.
Found the solution here:

mysqldump  --compatible=ansi --skip-extended-insert --compact  "$@" | \
iconv -c -f latin1 -t utf8 | \
awk '
[...]
