I am using Que with the Sequel gem, and I'd like to know whether it is safe to write jobs that handle more data than can safely fit in a single database transaction, such as imports/exports of 80k+ rows on a regular basis. I currently process records in transaction batches of 1,000 rows, roughly like the sketch below.
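(A minimal sketch of my current approach; `fetch_rows` and `process_row` are hypothetical stand-ins for my actual loading and processing code.)

```ruby
require "sequel"
require "que"

DB = Sequel.connect(ENV.fetch("DATABASE_URL"))
Que.connection = DB

class ImportJob < Que::Job
  BATCH_SIZE = 1_000

  def run(import_id)
    # fetch_rows and process_row are hypothetical stand-ins for my real code
    fetch_rows(import_id).each_slice(BATCH_SIZE) do |batch|
      DB.transaction do                       # each 1k batch commits on its own...
        batch.each { |row| process_row(row) }
      end                                     # ...or so I hope -- see below
    end
  end
end
```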
What concerns me is whether the gem or Postgres wraps the background worker's execution in some kind of implicit transaction, which could let the rollback segment grow out of hand and drag the DB into swappy hell.
The reason I'm asking is this line from the docs:
Safety - If a Ruby process dies, the jobs it's working won't be lost, or left in a locked or ambiguous state - they immediately become available for any other worker to pick up.
To me this screams "nested in a transaction", which, if my fears are true, could silently wrap all 80k records into the same rollback segment. I could try it on my laptop, but my laptop is far stronger than the production VM, so I'm afraid the job might crunch through successfully in my dev environment and then gloriously crash in deployment.
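For illustration, the failure mode I'm picturing looks roughly like this (purely hypothetical pseudocode, not Que's actual implementation). What makes it dangerous is that Sequel's nested `DB.transaction` calls join the surrounding transaction by default rather than committing independently:

```ruby
# Purely illustrative -- my fear, NOT Que's documented behavior:
DB.transaction do   # hypothetical implicit transaction opened by the worker
  job.run           # my per-batch DB.transaction blocks would, by Sequel's
                    # default, join this outer transaction instead of committing
end                 # => all 80k rows held in one giant rollback segment
```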
Can someone with similar Que experience help out?
Link: the same question on GH
Answered by the Que devs: