I am using Django + Celery + Beat on a project. Is there a way to reduce the number of requests made to the Redis data store?
The project runs on fly.io. Part of the fly.toml:
```toml
[processes]
app = "gunicorn --bind :8000 zzrec.wsgi"
beat = "celery -A zzrec beat -l info -S django"
celery = "celery -A zzrec worker -l INFO"
```
The project (dev version) is running on the free tier on Fly, but I am getting a `ResponseError: max daily request limit exceeded. Limit: 10000, Usage: 10000.` message from Upstash: https://upstash.com/docs/redis/troubleshooting/max_daily_request_limit
So I want to find out if it is possible to modify the settings of Celery or Beat so it does not hit the 10,000 request limit.
I need Celery to run "background" tasks, but especially for the dev environment it would be good enough if they are "checked" every minute or so, not every second.
UPDATE: I can see that someone already voted to close this question, so to make sure:
- I am not the author of the code of the working project mentioned above
- I am the owner of the project, but I am not a programmer
- I DID a Google search and did not find an answer to my question
- the programmer who set up Django + Celery + Beat does not know the answer to that question
Check first if adjusting the `CELERY_BEAT_SCHEDULE` could help, especially if your main goal is to manage the frequency at which tasks are scheduled and reduce the number of requests made to Redis (a bit like in "How do I run periodic tasks with celery beat?"). Since each task execution potentially involves communication with Redis (for task state management and result storage), reducing the frequency of tasks should lead to a proportional reduction in Redis requests.
Adjusting the beat schedule is generally straightforward and does not require deep changes to your application's codebase. It is a matter of configuration changes rather than code changes.
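For example, a minimal sketch of a less frequent schedule; the task path `zzrec.tasks.some_task` is a placeholder, not a name from your project:

```python
# settings.py
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "some-task-every-10-minutes": {
        # Hypothetical task path -- replace with the dotted path to your real task.
        "task": "zzrec.tasks.some_task",
        # Fire every 10 minutes instead of every minute (or more often).
        "schedule": crontab(minute="*/10"),
    },
}
```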
That configuration would schedule `some_task` to run every 10 minutes, reducing the frequency from a potential default of running every minute or even more frequently.

The `CELERY_BEAT_SCHEDULE` configuration is typically done within your Django project's settings, most often in `settings.py`, which contains the various configurations for your Django application.

In your project's `celery.py` and `__init__.py`, make sure Celery is initialized. This is typically done by creating a new Celery instance and loading the settings from Django, as in the sketch below. When you start your Django project, you also need to run Celery Beat alongside your Celery workers; in your `fly.toml` that is already the `beat` process (`celery -A zzrec beat -l info -S django`).
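A sketch of the conventional layout from the Celery "First steps with Django" docs; the package name `zzrec` is taken from `-A zzrec` in your `fly.toml`, the rest is assumed:

```python
# zzrec/celery.py
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "zzrec.settings")

app = Celery("zzrec")
# Load all CELERY_* settings (including CELERY_BEAT_SCHEDULE) from settings.py.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```

```python
# zzrec/__init__.py
# Ensure the Celery app is loaded when Django starts.
from .celery import app as celery_app

__all__ = ("celery_app",)
```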
To process the tasks, you need to run the Celery worker; in your `fly.toml` that is the `celery` process (`celery -A zzrec worker -l INFO`).
Overriding the `is_due` method of a custom schedule class allows you to define your own logic for determining when a task is due to run next. This can be a powerful feature, especially if you need a level of control that goes beyond the static schedules defined in `CELERY_BEAT_SCHEDULE`.

You have to define your custom schedule class. That class should inherit from `celery.schedules.schedule` or an appropriate base class, and you should override the `is_due` method to implement your custom scheduling logic, for example in a `schedules.py`:
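A sketch, assuming a hypothetical `zzrec/schedules.py` module and a 10-minute interval; `is_due` must return a `schedstate(is_due, next)` tuple, where `next` tells Beat how many seconds it can sleep before checking this entry again:

```python
# zzrec/schedules.py  (module name is an assumption)
from celery.schedules import schedule, schedstate


class EveryTenMinutes(schedule):
    """Run the task at most every 10 minutes and let Beat sleep in between."""

    def __init__(self, run_every=600, **kwargs):
        # run_every is in seconds; the base class converts it to a timedelta.
        super().__init__(run_every=run_every, **kwargs)

    def is_due(self, last_run_at):
        # remaining_estimate() comes from the base schedule class and returns
        # a timedelta until the task is next due.
        remaining = self.remaining_estimate(last_run_at)
        remaining_s = max(remaining.total_seconds(), 0)
        if remaining_s == 0:
            # Due now; check again after a full interval.
            return schedstate(is_due=True, next=self.run_every.total_seconds())
        # Not due yet; Beat can wait for the remaining time.
        return schedstate(is_due=False, next=remaining_s)
```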
Next, use instances of your custom schedule class in the `CELERY_BEAT_SCHEDULE` configuration: you create an instance of your custom class and pass it as the `schedule` parameter.

Make sure that your Celery application is correctly loading the configuration. If you are using a Django project, this typically means ensuring your `settings.py` is properly set up and that your `celery.py` file correctly loads these settings.

After defining your custom schedule class, you can import it in your `settings.py` and use it in `CELERY_BEAT_SCHEDULE`, for example:
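Again a sketch, reusing the hypothetical module, class, and task names from above:

```python
# settings.py
from zzrec.schedules import EveryTenMinutes  # hypothetical custom schedule class

CELERY_BEAT_SCHEDULE = {
    "some-task-every-10-minutes": {
        "task": "zzrec.tasks.some_task",  # hypothetical task path
        "schedule": EveryTenMinutes(),    # instance passed as the schedule
    },
}
```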
After making these changes, restart your Celery worker and beat processes to pick up the new schedule.