I need help understanding how Bull Queue (bull.js) processes concurrent jobs.
Suppose I have 10 Node.js instances that each instantiate a Bull Queue connected to the same Redis instance:
const bullQueue = require('bull');
const queue = new bullQueue('taskqueue', {...});
const concurrency = 5;
queue.process('jobTypeA', concurrency, job => {...do something...});
Does this mean that, globally across all 10 Node instances, there will be a maximum of 5 (concurrency) concurrently running jobs of type jobTypeA? Or am I misunderstanding, and the concurrency setting is per Node instance?
What happens if one Node instance specifies a different concurrency value?
Can I be certain that jobs will not be processed by more than one Node instance?
Ah, welcome! This is a meta answer and probably not what you were hoping for, but here is a general process for working this out:
I personally don't fully understand Bull's concurrency behaviour or the guarantees it provides either. Since the documentation isn't entirely clear:
Dive into the source to better understand what is actually happening. I usually just trace the relevant code path, here starting from queue.process(), to see how the concurrency value is actually used and where it is stored.
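For example, a quick way to locate the installed source so you can read through it (a minimal sketch; the exact lib/ layout depends on which Bull version you have installed):

// Prints the entry point of the locally installed bull package; open the
// surrounding node_modules/bull/lib/ directory and follow queue.process() from there.
console.log(require.resolve('bull'));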
If the implementation and the guarantees it offers are still not clear, then create test cases to try to invalidate the assumptions it sounds like you are making, as in the sketch below:
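For instance, here is a rough probe you could run in several terminals at once. This is a sketch only: it assumes Bull v3, a Redis server on 127.0.0.1:6379, and ioredis for a shared counter; the probe:inFlight key, the probe.js file name, and the 2-second fake workload are made up for illustration. If the logged in-flight count climbs above 5, the concurrency setting is per process rather than global; if the same job id is ever logged by two different pids, exclusive processing does not hold.

const Queue = require('bull');
const Redis = require('ioredis');

const redis = new Redis('redis://127.0.0.1:6379');
const queue = new Queue('taskqueue', 'redis://127.0.0.1:6379');
const concurrency = 5;

queue.process('jobTypeA', concurrency, async job => {
  // Shared counter in Redis so every process sees the global in-flight count.
  const inFlight = await redis.incr('probe:inFlight');
  console.log(`pid ${process.pid} started job ${job.id}, ${inFlight} in flight globally`);
  await new Promise(resolve => setTimeout(resolve, 2000)); // simulate work
  await redis.decr('probe:inFlight');
});

// Run one instance with `node probe.js seed` to enqueue some test jobs.
if (process.argv[2] === 'seed') {
  for (let i = 0; i < 100; i++) queue.add('jobTypeA', { i });
}

You can also give each terminal a different concurrency value, which answers your second question empirically as well.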
IMO the biggest thing is this: if exclusive message processing is an invariant for your application, i.e. a job being picked up by more than one worker would result in incorrectness, then even with great documentation I would highly recommend performing this kind of due diligence on the library yourself :p