@jcouyang
Last active December 9, 2023 16:46
Promise All with Limit of Concurrent N

The Promise All Problem

When processing a very large array, e.g. Promise.all(A_VERY_LARGE_ARRAY_OF_XHR_PROMISE),

you would probably blow your browser's memory by trying to send all the requests at the same time.
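The root cause is that a promise starts running the moment it is created, so by the time Promise.all sees the array every request is already in flight. A sketch of the naive pattern (urls here is a hypothetical array of request URLs):

// every fetch starts immediately; with a huge `urls` array that means
// thousands of requests in flight at the same time
Promise.all(urls.map(url => fetch(url))).then(results => console.log(results))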

The solution is to limit the number of concurrent requests, and to wrap each promise in a thunk:

Promise.allConcurrent(2)([()=>fetch('BLAH1'), ()=>fetch('BLAH2'),...()=>fetch('BLAHN')])

With the concurrency set to 2, it will send requests BLAH1 and BLAH2 at the same time.

As soon as BLAH1 returns a response and resolves, it will immediately send the request for BLAH3.

This way the number of requests in flight at any moment never exceeds the limit of 2 we just configured.
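To make the scheduling concrete, here is a small simulated run; delay is a hypothetical helper standing in for fetch, and the console output should show BLAH3 starting as soon as BLAH1 finishes, while BLAH2 is still pending:

// hypothetical stand-in for fetch: logs when it starts, resolves after `ms` milliseconds
const delay = (name, ms) => () =>
  new Promise(resolve => {
    console.log(`start ${name}`)
    setTimeout(() => { console.log(`done ${name}`); resolve(name) }, ms)
  })

Promise.allConcurrent(2)([
  delay('BLAH1', 100),  // finishes first...
  delay('BLAH2', 500),
  delay('BLAH3', 100),  // ...so BLAH3 is started while BLAH2 is still running
]).then(results => console.log(results))  // ['BLAH1', 'BLAH2', 'BLAH3']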

// Mocha tests; assume chai, sinon and lodash are installed, and that
// Promise.allConcurrent (defined below) has been loaded.
const { expect } = require('chai')
const sinon = require('sinon')
const { range } = require('lodash')

describe('promiseAllStepN', function(){
  describe('3 tasks, and concurrency is 2', function(){
    let tasks;
    beforeEach(function(){
      tasks = range(3).map(x=>sinon.stub())
    })
    describe('1st is finished', function(){
      it('will kick off the third task', function(done){
        tasks[0].returns(Promise.resolve(0))
        let task3 = tasks[2]
        tasks[1].returns(new Promise(resolve=>setTimeout(()=>{
          // by the time the 2nd task settles, the 3rd must already have started
          expect(task3.called).to.be.equal(true)
          resolve(1)
          done()
        }, 1000)))
        tasks[2].returns(Promise.resolve(2))
        return Promise.allConcurrent(2)(tasks).then(x=>console.log(x))
      })
    })
  })
  describe('10 tasks, and concurrency is 2', function(){
    let tasks;
    beforeEach(function(){
      tasks = range(10).map(x=>sinon.stub())
    })
    describe('1st is finished but 2nd is stuck', function(){
      it('final task will run before 2nd', function(done){
        tasks.forEach((task,index) => task.returns(Promise.resolve(index)))
        let task10 = tasks[9]
        tasks[1].returns(new Promise(resolve=>setTimeout(()=>{
          // the stuck 2nd task must not block the rest of the queue
          expect(task10.called).to.be.equal(true)
          resolve(1)
          done()
        }, 1000)))
        return Promise.allConcurrent(2)(tasks).then(x=>console.log(x))
      })
    })
  })
})
// Start the first n thunks immediately; every time one resolves, start the
// next thunk from the remaining list, so at most n promises run at a time.
function promiseAllStepN(n, list) {
  let tail = list.splice(n)   // thunks waiting to be started
  let head = list             // the first n thunks, started right away
  let resolved = []           // promises created so far, in start order
  let processed = 0           // how many thunks from `tail` have been started
  return new Promise(resolve=>{
    head.forEach(x=>{
      let res = x()
      resolved.push(res)
      res.then(y=>{
        runNext()
        return y
      })
    })
    function runNext(){
      if(processed == tail.length){
        // every thunk has been started; resolve with all results
        resolve(Promise.all(resolved))
      }else{
        // start the next pending thunk and chain another runNext behind it
        resolved.push(tail[processed]().then(x=>{
          runNext()
          return x
        }))
        processed++
      }
    }
  })
}
Promise.allConcurrent = n => list => promiseAllStepN(n, list)
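Note that promiseAllStepN does not catch rejections, so a single failed task will make the combined promise reject (the same issue discussed for the pAll variant in the comments below). A minimal sketch of keeping the batch alive, assuming a hypothetical list of URLs and that you are happy to get errors back as result values:

// hypothetical inputs; a .catch per task turns a failure into a result value
const urls = ['https://example.com/a', 'https://example.com/b', 'https://example.com/c']
const tasks = urls.map(url => () => fetch(url).catch(err => err))
Promise.allConcurrent(2)(tasks).then(results => console.log(results))
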
@avuenja commented Nov 19, 2021

The pure JS implementation without the callback hell :)

const pAll = async (queue, concurrency) => {
  let index = 0;
  const results = [];

  // Run a pseudo-thread
  const execThread = async () => {
    while (index < queue.length) {
      const curIndex = index++;
      // Use of `curIndex` is important because `index` may change after await is resolved
      results[curIndex] = await queue[curIndex]();
    }
  };

  // Start threads
  const threads = [];
  for (let thread = 0; thread < concurrency; thread++) {
    threads.push(execThread());
  }
  await Promise.all(threads);
  return results;
};

and a simple use case:

const test = async () => {
  const urls = ["url1", "url2", "url3"];
  const res = await pAll(
    urls.map(url => () => makeHTTPRequest(url)),
    5
  );
  console.log(res);
};

This is awesome man! 🎉

@jacksonv1lle

This is a great tool, thanks!

Not sure if it is intentional, but I have noticed that if one of the promises in a thread rejects, that thread stops and the whole call rejects, so the remaining results are lost. To prevent this I am catching the error and returning it.

results[curIndex] = await queue[curIndex]().catch(err => err);

@webshared

@jacksonv1lle it happens because Promise.all rejects immediately when any of the promises rejects. You can also wrap the call in a try/catch:

try {
  results[curIndex] = await queue[curIndex]();
} catch(err) {
  console.log("ERROR", err);
  results[curIndex] = err;
}
