@SimonEast
Last active August 2, 2023 15:47
PHP Example: Stream a CSV File to Browser with GZIP Compression (exporting from MySQL/PDO)
<?php
/**
 * This script performs a full dump of a database query into
 * CSV format and pipes it directly to the browser.
 *
 * - YES, the browser will save the CSV file to disk
 * - YES, it should support large files without using massive amounts of memory
 * - YES, it compresses the response using GZIP to reduce download time
 */
// Even though we're using unbuffered queries, we'll
// increase the memory limit just in case of issues
ini_set('memory_limit', '300M');
// Send browser headers to indicate that this should be *downloaded*
// instead of rendered in the browser
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Cache-Control: private', false);
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="CSV Export ' . date('Y-m-d') . '.csv";');
header('Content-Transfer-Encoding: binary');
// Connect to the DB...
// [Setup your PDO connection here...]
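// For example (an illustrative placeholder, not part of the original
// script; swap in your own host, database name and credentials):
//
//   $pdo = new PDO('mysql:host=localhost;dbname=DB_NAME;charset=utf8mb4', 'DB_USER', 'DB_PASS');
//   $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);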
// We'll use an *unbuffered* query as we cannot afford
// to load all records into PHP memory at one time
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, FALSE);
// Run our SELECT query
// Fields returned here MUST match with CSV column headers specified below
$resultset = $pdo->query('
    SELECT * FROM pledges
    ORDER BY pledges.id ASC
    # For testing only:
    # LIMIT 500
');
// Stream all the records out to the browser in CSV format.
// Flushing less often uses more server memory to maintain the buffer,
// but is probably faster and achieves better GZIP compression.
$flushEveryXRows = 5000;
// Enable GZIP on-the-fly compression
// ini_set('zlib.output_compression_level', 9);
ob_start('ob_gzhandler');
$outputBuffer = fopen('php://output', 'w');
// Create header row of CSV
// MUST exactly match the order of SQL columns returned, above
fputcsv($outputBuffer, [
    'Column 1',
    'Column 2',
    'Column 3',
    // etc.
]);
// Dump all rows (must match headers above)
$count = 0;
while ($r = $resultset->fetch(PDO::FETCH_NUM)) {
    fputcsv($outputBuffer, $r);

    // Every X rows, flush data to browser to free up memory from PHP
    // and ensure it doesn't exceed its memory_limit
    if (++$count % $flushEveryXRows == 0) {
        ob_flush();
        flush();
    }
}
fclose($outputBuffer);
@tmoy61 commented Mar 25, 2020

Thank you Simon. This is brilliant. Exactly what I needed. I just used it to download a 150 MB file (1M rows) in a few seconds.

Just one point that's not clear: I presume the fclose() at the end automatically does a final ob_flush and flush. I put in explicit flushes just to be sure.
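
A minimal sketch of what those explicit flushes could look like, placed after the fclose() call (an assumption about tmoy61's approach, not code from the gist):

fclose($outputBuffer);

// Flush and disable the ob_gzhandler buffer, then push any
// remaining bytes through to the browser
ob_end_flush();
flush();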

@ishmakembu

How would I add the PDO connection? Can you elaborate? I keep getting an error.

@spaceknight

@ishmakembu

try this:

$pdo = new PDO("mysql:host=localhost;dbname=DB_NAME", "DB_USER", "DB_PASS");
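
Note that localhost, DB_NAME, DB_USER and DB_PASS above are placeholders for your own host, database name and credentials. The connection also has to be created before the script's setAttribute() call that disables buffered queries, i.e. in the "[Setup your PDO connection here...]" section near the top.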
