I need help understanding how Bull Queue (bull.js) processes concurrent jobs.
Suppose I have 10 Node.js instances that each instantiate a Bull Queue connected to the same Redis instance:
const bullQueue = require('bull');
const queue = new bullQueue('taskqueue', {...});
const concurrency = 5;
queue.process('jobTypeA', concurrency, job => {...do something...});
Does this mean that, globally across all 10 Node instances, there will be a maximum of 5 (the concurrency value) concurrently running jobs of type jobTypeA? Or am I misunderstanding, and the concurrency setting is per Node instance?
What happens if one Node instance specifies a different concurrency value?
Can I be certain that jobs will not be processed by more than one Node instance?
I spent a fair amount of time digging into this after running into a problem with too many processor threads.
The short story is that Bull's concurrency is applied at the queue-object level, not at the Redis-queue level. If you dig into the code, the concurrency setting is invoked at the point where you call .process on your queue object. This means that even within the same Node application, if you create multiple queue objects and call .process multiple times, each call adds to the number of jobs that can be processed concurrently. One contributor posted the following:
So the answer to your question is: yes, your processes WILL be processed by multiple node instances if you register process handlers in multiple node instances.