@shawnbot
Created March 20, 2013 23:50
queue + d3 = asynchronous, parallelized, event-dispatching load queue
var loadQueue = function() {
  var q = queue(),
      defer = q.defer,
      dispatch = d3.dispatch("progress", "complete"),
      requests = [];

  // Wrap queue's defer() so that each deferred load tracks its own progress.
  q.defer = function(load, url) {
    return defer(function(callback) {
      var req = load(url, function(error, data) {
          // The request is done: mark it fully loaded and notify listeners.
          req.loaded = req.total;
          req.progress = 1;
          update();
          callback.apply(null, arguments);
        })
        .on("progress", function() {
          var e = d3.event;
          req.loaded = e.loaded;
          req.total = e.total;
          req.progress = e.loaded / e.total;
          update(); // dispatch live progress as bytes arrive
        });
      req.total = req.loaded = req.progress = 0;
      requests.push(req);
    });
  };

  // Aggregate progress across all requests and dispatch events.
  function update() {
    var total = 0,
        loaded = 0,
        progress = 0;
    requests.forEach(function(req) {
      total += req.total;
      loaded += req.loaded;
      progress += req.progress;
    });
    progress /= requests.length;
    dispatch.progress({
      total: total,
      loaded: loaded,
      progress: progress
    });
    if (progress >= 1) {
      dispatch.complete({
        loaded: loaded
      });
    }
  }

  // Expose dispatch's on() as q.on() so listeners can chain off the queue.
  d3.rebind(q, dispatch, "on");
  return q;
};
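A usage sketch for the page above (assumes queue.js and d3 v3 are loaded; the file names are hypothetical). Any d3 loader works as the first argument to defer(), since loadQueue only needs load(url, callback) to return a d3.xhr request:

```
// Hypothetical files; d3.json, d3.csv, d3.xhr, etc. all fit the interface.
var q = loadQueue()
    .defer(d3.json, "states.json")
    .defer(d3.csv, "population.csv")
    .on("progress", function(e) {
      console.log(Math.round(e.progress * 100) + "% loaded");
    })
    .on("complete", function(e) {
      console.log("loaded " + e.loaded + " bytes");
    })
    .await(function(error, states, population) {
      if (error) throw error;
      // render with the fully loaded data here
    });
```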
@mbostock commented Apr 1, 2013

Your progress is currently evenly weighted across all files. You might want to weight the progress by the size of each file, so that the estimate stays accurate when you're downloading files of vastly different sizes. One way to do that would be to compute the sum of loaded and the sum of total across all tasks, and then use those sums to compute the overall progress, rather than averaging per-request progress.
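The difference the suggestion makes can be shown with plain numbers (a self-contained sketch; the byte counts are made up):

```javascript
// Two hypothetical requests: a 1 KB file fully loaded, a 100 KB file untouched.
var requests = [
  { loaded: 1000, total: 1000 },
  { loaded: 0,    total: 100000 }
];

// Evenly weighted (the gist's current approach): average per-request progress.
var even = requests.reduce(function(sum, r) {
  return sum + (r.total ? r.loaded / r.total : 0);
}, 0) / requests.length;

// Size-weighted (the suggested approach): sum of loaded over sum of total.
var totals = requests.reduce(function(acc, r) {
  acc.loaded += r.loaded;
  acc.total += r.total;
  return acc;
}, { loaded: 0, total: 0 });
var weighted = totals.loaded / totals.total;

console.log(even);     // 0.5 — reports halfway done
console.log(weighted); // ~0.0099 — barely started, matching the bytes on the wire
```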

@shawnbot (Author) commented Apr 3, 2013

Good catch, Mike. Thanks!
