<!DOCTYPE HTML>
<html lang="en-US">
<head>
  <meta charset="UTF-8">
  <title>test upload by chunk</title>
</head>
<body>
  <input type="file" id="f" />
  <script src="script.js"></script>
</body>
</html>
(function() {
  var f = document.getElementById('f');

  // Handle a file that is already selected (e.g. after a page reload),
  // then listen for new selections.
  if (f.files.length)
    processFile();
  f.addEventListener('change', processFile, false);

  function processFile(e) {
    var file = f.files[0];
    var size = file.size;
    var sliceSize = 256; // chunk size in bytes; tiny, for demo purposes
    var start = 0;

    setTimeout(loop, 1);

    // Fire off one chunk per timer tick. Note: this does NOT wait for the
    // previous request to finish, so chunks may arrive out of order.
    function loop() {
      var end = start + sliceSize;
      if (size - end < 0) {
        end = size;
      }
      var s = slice(file, start, end);
      send(s, start, end);
      if (end < size) {
        start += sliceSize;
        setTimeout(loop, 1);
      }
    }
  }

  // POST one chunk along with its byte range.
  function send(piece, start, end) {
    var formdata = new FormData();
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/upload', true);
    formdata.append('start', start);
    formdata.append('end', end);
    formdata.append('file', piece);
    xhr.send(formdata);
  }

  /**
   * Normalize file.slice across the old vendor-prefixed implementations.
   */
  function slice(file, start, end) {
    var slice = file.mozSlice ? file.mozSlice :
                file.webkitSlice ? file.webkitSlice :
                file.slice ? file.slice : noop;

    return slice.bind(file)(start, end);
  }

  function noop() {
  }
})();
As noted by others, the snippet doesn't take into account whether the last request has completed or not. It also does not handle network failures. I recommend using an off-the-shelf solution, or putting some work into this proof of concept to get something usable.
The snippet from this comment https://gist.github.com/shiawuen/1534477#gistcomment-3017438 fixes it so that it doesn't overload the server. The server has to stitch the file back together from each piece that is sent.
I recommend using a library to handle this, such as http://www.resumablejs.com/
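One way to address the completion problem without a library is to send the chunks sequentially, awaiting each request before starting the next. A minimal sketch under those assumptions; `uploadInChunks` and `sendChunk` are illustrative names, not part of the gist:

```javascript
// Hypothetical sequential uploader: the next slice is not sent until the
// previous request has resolved. The network call is injected via
// `sendChunk` so fetch, XHR, or a test stub can be swapped in.
async function uploadInChunks(blob, chunkSize, sendChunk) {
  for (let start = 0; start < blob.size; start += chunkSize) {
    const end = Math.min(start + chunkSize, blob.size);
    // Await each chunk; a rejected request stops the loop.
    await sendChunk(blob.slice(start, end), start, end);
  }
}
```

With `fetch`, `sendChunk` could mirror the gist's `send`: build a `FormData` with `start`, `end`, and `file`, and `return fetch('/upload', { method: 'POST', body: fd })`.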
Hello guys,
the chunks are OK, but I can't re-create the file on the backend. I'm using PHP and I'm trying to upload an image; any tips?
Hi, I hope to remember your question until I have some free time to post a solution that I coded myself in PHP. It worked with test files up to 7 GB; the whole process (chunk and merge) is handled in PHP on the server side.
Hi @woodchucker, can you let me know about your solution? Right now I can upload files up to 250 MB (5M records).
Thanks.
I opened a new repo to share my solution: https://github.com/woodchucker/SplitMergePHP.git
Nice function, but in my tests the setTimeout with 1 ms is too fast, and the chunks DO NOT ARRIVE IN ORDER at my server.
I had to raise it to 50 ms (40 ms was still failing). I guess this really depends on your connectivity; one could also increase the slice size, I suppose.
So just be aware of that when you implement this function in your code...
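Rather than tuning the timer, a more robust fix is to tag each chunk with its ordinal index so the server can reassemble the file regardless of arrival order. A sketch, assuming the server reads an extra `index` field from the form data; `chunkMeta` is an illustrative helper, not part of the gist:

```javascript
// Precompute the index and byte range of every chunk up front, so the
// index can be appended to the FormData alongside start/end.
function chunkMeta(fileSize, sliceSize) {
  var chunks = [];
  for (var i = 0; i * sliceSize < fileSize; i++) {
    var start = i * sliceSize;
    chunks.push({
      index: i,
      start: start,
      end: Math.min(start + sliceSize, fileSize)
    });
  }
  return chunks;
}
```

Each `send` would then also do `formdata.append('index', chunk.index)`, and the server can write piece `i` at offset `i * sliceSize` whenever it happens to arrive.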