How to call a Celery task in another docker container/queue?


I am trying to recreate a legacy app that has been down for ages. There is no documentation, and no one left on the team who knows anything about it.

The general idea is that there are several Celery worker processes, each with their own Dockerfile and docker container.

Generally, the Celery app initialization in the code is something like:

worker = Celery(
    # "broker" and "backend" are the constructor's documented keyword
    # arguments; "broker_url"/"result_backend" are the names of the
    # corresponding config settings, not constructor kwargs.
    broker=settings.CELERY_BROKER_URL,
    backend=settings.CELERY_RESULT_BACKEND,
)

And in the docker-compose.yaml, each worker service gets a command like this to set up its own queue:

command: celery -A org_name.app_name.tasks worker -l INFO -E -Q org_name.app_name.tasks -n app_name@%h
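For context, a minimal docker-compose sketch of two such worker services might look like the following (the service names, build contexts, and Redis broker URL are hypothetical placeholders, not taken from the original project):

```yaml
services:
  worker_a:
    build: ./worker_a   # hypothetical build context with its own Dockerfile
    command: celery -A org_name.app_name.tasks worker -l INFO -E -Q org_name.app_name.tasks -n app_name@%h
    environment:
      CELERY_BROKER_URL: redis://broker:6379/0   # assumed shared broker

  worker_b:
    build: ./worker_b
    command: celery -A org_name.worker_B.tasks worker -l INFO -E -Q org_name.worker_B.tasks -n worker_B@%h
    environment:
      CELERY_BROKER_URL: redis://broker:6379/0

  broker:
    image: redis:7
```

Each worker only consumes from the queue named in its own `-Q` flag, even though all of them connect to the same broker.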

When I look at the Celery worker log output, this initialization seems to work, and I see the registered Celery tasks. However, if I'm on worker_A and I do:

worker_A_celery_app.send_task("org_name.worker_B.tasks.worker_B_task")

I get an AsyncResult back, but within my Celery Flower instance I never see the task get created, sent, or registered.

How can I use this queue setup to call tasks across my Celery workers, given that they all share the same broker?
