By @hungtrinh, forked from SimonEast/Export CSV.php, created July 24, 2019 03:01
PHP Example: Stream a CSV File to Browser with GZIP Compression (exporting from MySQL/PDO)
<?php
/**
 * This script performs a full dump of a database query into
 * CSV format and pipes it directly to the browser.
 *
 * - YES, the browser will save the CSV file to disk
 * - YES, it should support large files without using massive amounts of memory
 * - YES, it compresses the response using GZIP to reduce download time
 */
// Even though we're using unbuffered queries, we'll
// increase the memory limit just in case of issues
ini_set('memory_limit', '300M');
// Send browser headers to indicate that this should be *downloaded*
// instead of rendered in browser
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Cache-Control: private', false);
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="CSV Export ' . date('Y-m-d') . '.csv";');
header('Content-Transfer-Encoding: binary');
// Connect to DB...
// [Setup your PDO connection here...]
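// (Illustrative sketch only: the DSN, database name and credentials below
// are placeholders, not part of the original gist; replace them with your
// own. ERRMODE_EXCEPTION makes connection/query failures throw instead of
// failing silently.)
$pdo = new PDO(
    'mysql:host=localhost;dbname=your_database;charset=utf8mb4',
    'your_username',
    'your_password',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);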
// We'll use an *unbuffered* query as we cannot afford
// to load all records into PHP memory at one time
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, FALSE);
// Run our SELECT query
// Fields returned here MUST match with CSV column headers specified below
$resultset = $pdo->query('
    SELECT * FROM pledges
    ORDER BY pledges.id ASC
    # For testing only:
    # LIMIT 500
');
// Stream all the records out to the browser in CSV format
// Flushing less often requires more server memory usage to maintain the buffer
// but is probably faster and does better gzip compression
$flushEveryXRows = 5000;
// Enable GZIP on-the-fly compression
// ini_set('zlib.output_compression_level', 9);
ob_start('ob_gzhandler');
$outputBuffer = fopen('php://output', 'w');
// Create header row of CSV
// MUST exactly match the order of SQL columns returned, above
fputcsv($outputBuffer, [
    'Column 1',
    'Column 2',
    'Column 3',
    // etc.
]);
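// (Optional alternative, a sketch not in the original gist: with the MySQL
// PDO driver, the header row can instead be derived from the result set
// metadata so it always matches the SELECT above.
// PDOStatement::getColumnMeta() support is driver-dependent, so treat this
// as an assumption. Uncomment to use in place of the hard-coded headers:)
//
// $headers = [];
// for ($i = 0; $i < $resultset->columnCount(); $i++) {
//     $meta = $resultset->getColumnMeta($i);
//     $headers[] = $meta['name'];
// }
// fputcsv($outputBuffer, $headers);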
// Dump all rows (must match headers above)
$count = 0;
while ($r = $resultset->fetch(PDO::FETCH_NUM)) {
    fputcsv($outputBuffer, $r);
    // Every X rows, flush data to browser to free up memory from PHP
    // and ensure it doesn't exceed its memory_limit
    if (++$count % $flushEveryXRows == 0) {
        ob_flush();
        flush();
    }
}
fclose($outputBuffer);