Is Que safe for writing jobs that handle a lot of data?


I am using Que with the Sequel gem, and I would like to know whether it is safe to write jobs that handle a lot of data, more than can safely be placed in a single database transaction, such as imports/exports of 80k+ rows on a regular basis. I currently process records in 1k-record transaction batches, roughly like the sketch below.
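
For context, here is a simplified sketch of what one of these jobs looks like; the table and column names are placeholders, and the Que connection/worker setup is omitted:

    require 'que'
    require 'sequel'

    DB = Sequel.connect(ENV.fetch('DATABASE_URL'))

    class ImportJob < Que::Job
      BATCH_SIZE = 1_000

      def run(import_id)
        # Read the staged rows and insert them in 1k-row transactions,
        # so no single transaction ever covers the full 80k+ rows.
        DB[:staging_rows]
          .where(import_id: import_id)
          .select(:name, :email)
          .each_slice(BATCH_SIZE) do |batch|
            DB.transaction do
              DB[:contacts].multi_insert(batch)
            end
          end
      end
    end

    # Enqueued elsewhere with: ImportJob.enqueue(import_id)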

What concerns me is whether the gem or Postgres wraps the background worker's execution in some kind of implicit transaction, which could make the rollback segment grow out of hand and drag the database into swapping until it falls over.

The reason I'm asking is this line from the docs:

Safety - If a Ruby process dies, the jobs it's working won't be lost, or left in a locked or ambiguous state - they immediately become available for any other worker to pick up.

To me this screams "nested in a transaction", which, if my fears are justified, could silently wrap my 80k records into the same rollback segment. I could try it on my laptop, but my laptop is far stronger than the production VM, so I'm afraid it might crunch through the data just fine in my dev environment and then gloriously crash in deployment.

Can someone with similar Que experience help out?

Link: the same question on GH


There is 1 answer below


Answered by the Que devs:

There's no implicit transaction around each job - that guarantee is offered by locking the job id with an advisory lock. Postgres takes care of releasing the advisory lock for us if the client connection is lost, regardless of transactional state.
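
In other words, the job body controls its own transactions: the per-batch commits in a sketch like the one above become durable as soon as each transaction ends, while Que's session-level advisory lock on the job id is held independently of them. A rough way to see this, assuming DB is the Sequel connection from the sketch, is to look at pg_locks while a worker is busy:

    # While a worker runs a job, Que holds a session-level advisory lock
    # on the job id; it appears in pg_locks and is not tied to any
    # transaction the job body opens or commits.
    DB[:pg_locks]
      .where(locktype: 'advisory')
      .select(:classid, :objid, :granted, :pid)
      .all

One practical consequence for the 80k-row jobs above: if a worker dies halfway through, the batches it already committed stay committed and another worker will re-run the whole job, so the job body needs to be safe to re-run.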