how to use bluebirdJS promises to make API requests in BATCHES instead of just limiting by concurrency


I have 1000 HTTP API requests to make. I have kept them all as promises in an array. I want to execute them in "BATCHES" of 100 at a time - no more than that, to avoid hitting any API rate limit / throttling etc.

While Bluebird provides the .map() function with a concurrency option, what that does is limit the number of calls in flight AT A TIME. It ensures that no more than 100 concurrent requests are being worked on at once: as soon as the 1st request resolves, it begins processing the 101st request. It doesn't wait for all 100 to resolve before starting the next 100.

The "BATCHING" behavior I am looking for is to first process 100 requests, and ONLY AFTER all 100 of those requests have completed, begin the next 100 requests.
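To make the distinction concrete, here is a minimal sketch of the "rolling window" behavior a concurrency limit gives you, written with native promises (the name `mapConcurrent` is illustrative, not Bluebird's implementation): the moment any request resolves, the next one starts, so there is never a batch boundary.

```javascript
// Rolling-window concurrency limiter: at most `concurrency` tasks are
// in flight at once, and a new task starts as soon as any task finishes.
async function mapConcurrent(items, fn, concurrency) {
  const results = new Array(items.length);
  let next = 0;
  // Each worker repeatedly claims the next unprocessed index.
  async function worker() {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i], i);
    }
  }
  const workers = Array.from(
    { length: Math.min(concurrency, items.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}
```

This is the behavior the question wants to avoid: the 101st task begins as soon as a slot frees up, rather than after the whole first 100 settle.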

Does Bluebird provide any API out of the box to handle batches this way?


You can split the big urls array into an array of batches. For each batch, run Promise#map, which resolves when all of that batch's async operations have finished, then run the batches in sequence using Array#reduce.

function readBatch(urls) {
  // `request` is assumed to return a promise for the HTTP response.
  // Promise.map takes the array as its first argument.
  return Promise.map(urls, url => request(url));
}

function read(urlBatches) {
  // Chain the batches so each one starts only after the previous resolves.
  return urlBatches.reduce((p, urls) => {
    return p.then(() => readBatch(urls));
  }, Promise.resolve());
}

const BATCH_SIZE = 100;
let urlBatches = [];
for (let i = 0; i < urls.length; i += BATCH_SIZE) {
  let batch = urls.slice(i, i + BATCH_SIZE);
  urlBatches.push(batch);
}

read(urlBatches)
  .then(() => { ... }); // will be called when all 1000 urls are processed
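If you don't need Bluebird for anything else, the same batching can be sketched with native promises and async/await; `fetchFn` below stands in for the `request` function above and is assumed to return a promise.

```javascript
// Process urls in fixed-size batches: the whole batch must settle
// (via Promise.all) before the next slice of urls is started.
async function readInBatches(urls, fetchFn, batchSize) {
  const results = [];
  for (let i = 0; i < urls.length; i += batchSize) {
    const batch = urls.slice(i, i + batchSize);
    results.push(...(await Promise.all(batch.map(url => fetchFn(url)))));
  }
  return results;
}
```

Note that if any request in a batch rejects, Promise.all rejects immediately and the remaining batches are never started, which is usually the behavior you want when backing off from a rate limit.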