My colleague wrote the Celery tasks, the necessary configuration in the settings file, and also the supervisor config file. Everything was working perfectly fine. The project has been handed over to me, and I am seeing some issues that I have to fix.
There are two projects running on a single machine. Both projects are almost the same; let's call them projA and projB.
The supervisord.conf file looks like this:
;for projA
[program:celeryd]
directory=/path_to_projA/
command=celery -A project worker -l info
...
[program:celerybeat]
directory=/path_to_projA/
command=celery -A project beat -l info
...
; For projB
[program:celerydB]
directory=/path_to_projB/
command=celery -A project worker -l info
...
[program:celerybeatB]
directory=/path_to_projB/
command=celery -A project beat -l info
...
The issue is: I am creating tasks through a loop, and only one task is received by the celeryd of projA; the remaining tasks are not received (or may have been received by the celeryd of projB).
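For reference, this is roughly how the tasks are created (my_task here just stands in for my actual task):

# dispatch one task per iteration of the loop; my_task is a placeholder name
from project.tasks import my_task

for i in range(10):
    my_task.delay(i)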
But when I stop the Celery programs for projB, everything works well. Please note that the actual name of the Django app is project, hence celery -A project worker/beat -l info.
Please bear with me, I am new to Celery; any help is appreciated. TIA.
As the Celery docs explain, when multiple tasks are created through a loop, the tasks are distributed evenly between the two workers, i.e. the worker of projA and the worker of projB, because both workers are identical and consume from the same default queue. If the projects are similar, or as you mentioned almost the same, you can use a dedicated Celery queue per project, but of course the queue names must differ across the projects. The Celery docs for this are provided here. You need to set CELERY_DEFAULT_QUEUE, CELERY_DEFAULT_ROUTING_KEY and CELERY_QUEUES in your settings.py file.
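For example, projA's settings.py could look like this (proja_queue is just an example name; projB would use a different one, e.g. projb_queue):

# settings.py of projA -- give each project its own queue name
from kombu import Queue

CELERY_DEFAULT_QUEUE = 'proja_queue'
CELERY_DEFAULT_ROUTING_KEY = 'proja_queue'
CELERY_QUEUES = (
    Queue('proja_queue', routing_key='proja_queue'),
)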
And your supervisor.conf file needs the queue name on the worker command lines for all the programs. For example:
command=celery -A project worker -l info -Q <queue_name>
(Note that celery beat itself takes no -Q option; beat routes its tasks according to the CELERY_DEFAULT_QUEUE setting above.)
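So, with example queue names for the two projects, the worker commands would become:

;for projA
command=celery -A project worker -l info -Q proja_queue
; For projB
command=celery -A project worker -l info -Q projb_queue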
And that should work, based on my experience.