Quick code to export an entire database schema to RFC4180-compliant CSV files and wrap them in a Gzip-compressed tarball.
/* exportDatabaseToCSV
	Author: Blake C. Miner
	$db - must be a PDO connection
	$database - string, the name of the schema to be backed up
	$filename - string, the name of the *.tar file to be generated
	Returns an array mapping each exported table name to its row count.
*/
function exportDatabaseToCSV($db, $database, $filename) {
	// Delete files left over from a previous export
	if(file_exists($filename) )
		unlink($filename);
	if(file_exists($filename . ".gz") )
		unlink($filename . ".gz");
	// Get list of tables
	$stmt = $db->prepare('select table_name from information_schema.tables where table_schema=?');
	$stmt->execute(array($database) );
	// Create archive; note that Phar::TAR is the 4th (format) argument of PharData's constructor
	$archive = new PharData($filename, 0, null, Phar::TAR);
	$info = array();
	// Export each table
	while($table = $stmt->fetch(PDO::FETCH_NUM) ) {
		$fileHandle = tmpfile();
		$tstmt = $db->query("select * from `$table[0]`");
		// Add column names to the first line of the CSV file
		$columnNames = array();
		for($i = 0; $i < $tstmt->columnCount(); $i++) {
			$meta = $tstmt->getColumnMeta($i);
			$columnNames[] = $meta['name'];
		}
		fputcsv($fileHandle, $columnNames);
		// Add rows, replacing null values with "NULL"
		$numRows = 0;
		while($row = $tstmt->fetch(PDO::FETCH_NUM) ) {
			for($i = 0; $i < count($row); $i++) {
				if(is_null($row[$i]) )
					$row[$i] = "NULL";
			}
			fputcsv($fileHandle, $row);
			$numRows++;
		}
		// Read the file back into $data in 16 MB chunks
		fseek($fileHandle, 0);
		$data = "";
		while($chunk = fread($fileHandle, 1024 * 1024 * 16) )
			$data .= $chunk;
		fclose($fileHandle); // tmpfile() is deleted automatically on close
		// Add the data to the archive
		$archive->addFromString($table[0] . ".csv", $data);
		$info[$table[0]] = $numRows;
	}
	// Compress archive and cleanup
	$archive->compress(Phar::GZ); // writes $filename.gz
	unset($archive); // release the Phar handle so the uncompressed *.tar can be removed
	unlink($filename);
	return $info;
}
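A minimal usage sketch follows. It is not part of the original gist; the DSN, credentials, schema name, and output filename are placeholder assumptions, and it requires a reachable MySQL server:

```php
<?php
// Hypothetical example — host, credentials, and schema name are assumptions.
$db = new PDO('mysql:host=localhost;charset=utf8', 'user', 'secret');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$info = exportDatabaseToCSV($db, 'my_database', 'backup.tar');
// Produces backup.tar.gz containing one RFC4180 CSV file per table
foreach($info as $tableName => $rowCount)
	echo "$tableName: $rowCount rows\n";
```

Note that the filename passed in should end in `.tar` so that PharData recognizes the archive format.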

@bminer bminer commented Jun 20, 2012

This code is useful for exporting MySQL databases to CSV files when the CSV files must comply with RFC4180. CSVs generated by MySQL itself (e.g. `SELECT ... INTO OUTFILE`) don't quite adhere to the standard: by default, fields are not enclosed in double quotes, and special characters are backslash-escaped rather than quote-doubled.
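To illustrate the difference (this example is not part of the gist): RFC4180 quoting wraps a field in double quotes and doubles any embedded quote characters, which is exactly what PHP's `fputcsv()` emits — whereas MySQL's default `INTO OUTFILE` escaping would produce `\"` instead.

```php
<?php
// Write a row containing a quote and an embedded newline to an in-memory stream.
$fh = fopen('php://memory', 'r+');
fputcsv($fh, array('id', 'He said "hi"', "multi\nline"));
rewind($fh);
echo stream_get_contents($fh);
// RFC4180-style quoting: embedded quotes doubled, special fields enclosed:
// id,"He said ""hi""","multi
// line"
fclose($fh);
```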
