@shiawuen
Created December 29, 2011 15:05
Sample to upload a file in chunks
<!DOCTYPE HTML>
<html lang="en-US">
<head>
  <meta charset="UTF-8">
  <title>test upload by chunk</title>
</head>
<body>
  <input type="file" id="f" />
  <script src="script.js"></script>
</body>
</html>
(function() {
  var f = document.getElementById('f');

  // Process immediately if a file is already selected,
  // otherwise wait for the user to pick one.
  if (f.files.length)
    processFile();
  f.addEventListener('change', processFile, false);

  function processFile() {
    var file = f.files[0];
    var size = file.size;
    var sliceSize = 256;
    var start = 0;

    setTimeout(loop, 1);

    function loop() {
      var end = start + sliceSize;
      if (size - end < 0) {
        end = size;
      }
      var s = slice(file, start, end);
      send(s, start, end);
      if (end < size) {
        start += sliceSize;
        setTimeout(loop, 1);
      }
    }
  }

  // Upload one chunk together with its byte range.
  function send(piece, start, end) {
    var formdata = new FormData();
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/upload', true);
    formdata.append('start', start);
    formdata.append('end', end);
    formdata.append('file', piece);
    xhr.send(formdata);
  }

  /**
   * Normalize file.slice across vendor-prefixed implementations
   */
  function slice(file, start, end) {
    var slice = file.mozSlice ? file.mozSlice :
                file.webkitSlice ? file.webkitSlice :
                file.slice ? file.slice : noop;
    return slice.bind(file)(start, end);
  }

  function noop() {}
})();
@toraritte

Thanks a lot for this gist! Too bad it took me a couple of days to find it... :)


ghost commented Sep 14, 2017

Nice, working. Just wondering if this needs to be taken care of on the backend as well? Do I need to combine those chunks, or is it a multipart upload? I see the form to which you are attaching the data and sending it. Looking forward to some explanation.

@pasadyaguy

Just what I was looking for! Thanks!

@marmz

marmz commented Nov 19, 2017

Nice job :)

Can I ask why you send the chunks in a setTimeout() loop instead of a "common" for() iterator?
Maybe there is some reason I don't get?

@rizsi

rizsi commented Nov 29, 2017

marmz: sending the data itself is asynchronous. The upload starts shortly after the xhr.send(...) call while JS execution continues immediately.

If you send the chunks in a for loop, then all chunks are sent at once, which opens N streams (N = file.size / chunkSize). This would flood the server with all the pieces at once, making the whole thing pointless.

The main reason for chunked upload is that the server does not need to store the whole file in memory (which can also be worked around on the server side by streaming the data directly to a file); the second reason is to make big file uploads resumable in case the TCP stream breaks. Opening more TCP streams to the same server at the same time will not make things better in most cases, so it is pointless.

Starting chunks on a timer (as in this example) is also not a good solution. You should send each chunk only after the previous one has finished. To achieve this, use the load event to move on to the next chunk and the other events to show errors to the user:

xhr.addEventListener("progress", updateProgress);
xhr.addEventListener("load", transferComplete); // Handler has to send the next chunk
xhr.addEventListener("error", transferFailed);
xhr.addEventListener("abort", transferCanceled);

See: https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/Using_XMLHttpRequest
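The chaining rizsi describes can be sketched as a small driver (a minimal sketch: `sendChunk` is a hypothetical injected function that, in the browser, would wrap the XHR/FormData call and fire its callback from the `load` handler):

```javascript
// Sequential chunk upload: each chunk is sent only after the previous
// one has completed, mirroring chaining off XHR's "load" event.
// sendChunk(start, end, cb) is a hypothetical stand-in for the real
// XHR call; it must invoke cb once the chunk is acknowledged.
function uploadInChunks(file, sliceSize, sendChunk, onDone) {
  var start = 0;
  function next() {
    if (start >= file.size) {
      onDone(); // all chunks acknowledged
      return;
    }
    var end = Math.min(start + sliceSize, file.size);
    sendChunk(start, end, function () {
      start = end; // advance only after the chunk completed
      next();
    });
  }
  next();
}
```

Because the next chunk is only requested from the completion callback, at most one request is in flight at a time, so the server is never flooded.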

@ufukomer

Well-working example! :)

@Dave3of5

Dave3of5 commented Sep 5, 2019

Thanks so much for this! As noted above, this has the potential to overload the server, so I have a variant that only sends the next chunk once the current one is complete:

function send(file, start, end) {
    var formdata = new FormData();
    var xhr = new XMLHttpRequest();

    if (size - end < 0) { // Uses a closure on size here; you could pass it as a param
        end = size;
    }
    xhr.onreadystatechange = function () {
        if (xhr.readyState == XMLHttpRequest.DONE) {
            if (end < size) {
                console.log('Done Sending Chunk');
                send(file, start + sliceSize, start + (sliceSize * 2));
            } else {
                console.log('Upload complete');
            }
        }
    };

    xhr.open('POST', '/uploadchunk', true);

    var slicedPart = slice(file, start, end);

    formdata.append('start', start);
    formdata.append('end', end);
    formdata.append('file', slicedPart);
    console.log('Sending Chunk (Start - End): ' + start + ' ' + end);

    xhr.send(formdata);
}

You can invoke it like so:

var file = $files[0]; // This is your file object
var size = file.size;
var sliceSize = 10000000; // Send 10MB Chunks
var start = 0;

console.log('Sending File of Size: ' + size);

send(file, 0, sliceSize);

I'm using this to send large files to a server without overloading its memory. I found that 10 MB chunks worked quite well; the 256-byte chunks in the original were far too slow.

@i-oden

i-oden commented Sep 18, 2019

@Dave3of5 Do you know how to apply this to multiple files?

@Dave3of5

@inaod568 The code I've written is only designed for a single-file upload.

To get it to work with multiple files, you would have to send the filename with the FormData and then invoke send for each file, starting at 0. To keep the memory footprint down on the server, you should send the chunks one at a time, which my example already does. So you would need to wait until the send is fully complete before moving on to the next file.

Probably the best way is to wrap send in a promise. That's going to be a bit complicated, given that send only completes when all the Ajax requests have completed.
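The promise-based multi-file approach outlined above might look like this (an illustrative sketch, not Dave3of5's actual code; `sendChunk` is a hypothetical wrapper around the XHR call that would also append the filename to the FormData, and it must invoke its callback when the chunk completes):

```javascript
// Wrap the chunked upload of one file in a Promise that resolves
// only after its last chunk has been acknowledged.
function uploadFile(file, sliceSize, sendChunk) {
  return new Promise(function (resolve) {
    var start = 0;
    (function next() {
      if (start >= file.size) return resolve(file.name);
      var end = Math.min(start + sliceSize, file.size);
      sendChunk(file, start, end, function () {
        start = end; // move to the next chunk only on completion
        next();
      });
    })();
  });
}

// Upload files strictly one after another, so only one file
// (and one chunk) is ever in flight at a time.
async function uploadAll(files, sliceSize, sendChunk) {
  for (var i = 0; i < files.length; i++) {
    await uploadFile(files[i], sliceSize, sendChunk);
  }
}
```

Awaiting each file in turn keeps both the chunks of one file and the files themselves strictly sequential.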

@glengit

glengit commented Oct 2, 2019

Hello,

How can I merge or compile the chunks into one file in PHP?

I have this snippet, but it doesn't seem to work.

$num = $_POST['num']; // 1, 2, 3, 4
$tmp_name = $_FILES['upload']['tmp_name'];
$upload = $_SERVER["DOCUMENT_ROOT"] . '/chunks/upload/';
$filename = $_FILES['upload']['name'];
$new_file_name = 'chunk-' . $num . $filename; // pattern: chunk-1myfilename.jpg, chunk-2myfilename.jpg, ...
$target_file = $upload . $new_file_name;

move_uploaded_file( $tmp_name, $target_file );

for ( $i = 1; $i <= $total_chunks; $i++ ) {
	$file = fopen( 'http://example/chunks/upload/' . 'chunk-' . $i . $filename, 'rb' );
	$buff = fread( $file, 1048576 );
	fclose( $file );

	$final = fopen( $upload . $filename, 'ab' );
	$write = fwrite( $final, $buff );
	fclose( $final );

	unlink( $upload . 'chunk-' . $i . $filename );
}

Thanks.

Glen

@kanishkamehta-Aricent

> Nice, working. Just wondering if this needs to be taken care of on the backend as well? Do I need to combine those chunks, or is it a multipart upload? I see the form to which you are attaching the data and sending it. Looking forward to some explanation.

@thapliyalshivam Does it need to be taken care of on the backend, or will the frontend alone work?

@ARPNetBR

Hello guys,
the chunks are OK, but I can't re-create the file on the backend. I'm using PHP and trying to upload an image; any tips?

@lingtalfi

Nice function, but in my tests, the setTimeout with 1 ms is too fast and the chunks DO NOT ARRIVE IN ORDER at my server.
I had to raise it to 50 ms (40 ms was still failing). I guess this really depends on what connectivity you have; one could also increase the slice size, I suppose.

So just be aware of that when you implement this function in your code...

@jasekiw

jasekiw commented Apr 20, 2020

As noted by others, the snippet doesn't take into account whether the last request has completed or not. It also does not handle network failures. I recommend using an out-of-the-box solution, or putting some work into this proof of concept, to get something usable.

The snippet from this comment https://gist.github.com/shiawuen/1534477#gistcomment-3017438 fixes it so that it doesn't overload the server. The server has to stitch the file back together from the pieces that are sent.

I recommend using a library to handle this, like http://www.resumablejs.com/

@woodchucker

woodchucker commented Jun 20, 2021

> Hello guys,
> the chunks are OK, but I can't re-create the file on the backend. I'm using PHP and trying to upload an image; any tips?

Hi, I hope to remember your question until I have some free time to post a solution I coded myself in PHP. It works with test files up to 7 GB; the whole process (chunking and merging) is handled server-side in PHP.

@phanngoctuan1990

> Hello guys,
> the chunks are OK, but I can't re-create the file on the backend. I'm using PHP and trying to upload an image; any tips?
>
> Hi, I hope to remember your question until I have some free time to post a solution I coded myself in PHP. It works with test files up to 7 GB; the whole process (chunking and merging) is handled server-side in PHP.

Hi @woodchucker, can you let me know about your solution? Right now I can upload files up to 250 MB (5M records).
Thanks.

@woodchucker

woodchucker commented Aug 26, 2021

I opened a new repo to share my solution: https://github.com/woodchucker/SplitMergePHP.git
