How to run Python RQ jobs on a DigitalOcean application server?


I have a FastAPI application deployed on DigitalOcean. It has multiple API endpoints, and in some of them I have to run a scraping function as a background job using the RQ package, so the user isn't kept waiting for a server response.

I've already managed to create a Redis database on DigitalOcean and successfully connect the application to it, but I'm facing issues with running the RQ worker. Here's the code, inspired by RQ's official documentation:

import os

import redis
from rq import Worker, Queue, Connection

listen = ['high', 'default', 'low']

# Connect to DigitalOcean's managed Redis database
REDIS_URL = os.getenv('REDIS_URL')
conn = redis.Redis.from_url(url=REDIS_URL)

# Create an RQ queue using the Redis connection
q = Queue(connection=conn)

with Connection(conn):
    worker = Worker([q], connection=conn)  # This instruction works fine
    worker.work()  # The deployment fails here; the DigitalOcean server crashes at this instruction
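
For context, the jobs are enqueued from an endpoint along these lines; the scrape_site function and the /scrape route are illustrative placeholders, not my exact code:

import os

import redis
from fastapi import FastAPI
from rq import Queue

app = FastAPI()

conn = redis.Redis.from_url(os.getenv('REDIS_URL'))
q = Queue(connection=conn)

def scrape_site(url: str):
    # placeholder for the actual scraping logic
    ...

@app.post('/scrape')
def start_scrape(url: str):
    # Enqueue the job and return immediately; the RQ worker picks it up
    job = q.enqueue(scrape_site, url)
    return {'job_id': job.get_id()}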

The worker/job execution runs just fine locally but fails on DigitalOcean's server. What could this be due to? Is there anything I'm missing, or any kind of configuration that needs to be done on DigitalOcean's end?

Thank you in advance!

I also tried FastAPI's BackgroundTasks class. At first it ran smoothly, but the job stopped halfway through with no feedback from the class on what was happening in the background. I'm guessing it's due to a timeout that doesn't seem to have a custom configuration in FastAPI (perhaps because its background tasks are meant to be low-cost and fast).
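
Roughly what that attempt looked like (again illustrative, with the same placeholder names):

from fastapi import BackgroundTasks, FastAPI

app = FastAPI()

def scrape_site(url: str):
    # long-running scraping logic; this is what stopped halfway through
    ...

@app.post('/scrape')
def start_scrape(url: str, background_tasks: BackgroundTasks):
    # The task runs in-process after the response is sent, so it dies with
    # the worker process and offers no timeout or retry control
    background_tasks.add_task(scrape_site, url)
    return {'status': 'started'}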

I'm also thinking of trying Celery out, but I'm afraid I would run into the same issues as with RQ.

2 Answers

Answer 1:

Create a configuration file using this command: sudo nano /etc/systemd/system/myproject.service

[Unit]
Description=Gunicorn instance to serve myproject
After=network.target

[Service]
User=user
Group=www-data
WorkingDirectory=/home/user/myproject
Environment="PATH=/home/user/myproject/myprojectvenv/bin"
ExecStart=/home/user/myproject/myprojectvenv/bin/gunicorn --workers 3 -k uvicorn.workers.UvicornWorker --bind unix:myproject.sock -m 007 wsgi:app

[Install]
WantedBy=multi-user.target

Note that FastAPI is an ASGI application, so gunicorn needs the uvicorn worker class (-k uvicorn.workers.UvicornWorker) rather than its default sync workers.

The RQ worker needs its own process-manager entry. The [program:...] block below is Supervisor syntax, not systemd, so it belongs in a separate file such as /etc/supervisor/conf.d/rq_worker.conf. Note also that the rq CLI does not take Celery-style -A/-l flags; the subcommand is rq worker followed by the queue names:

[program:rq_worker]
command=/home/user/myproject/myprojectvenv/bin/rq worker high default low
directory=/home/user/myproject
autostart=true
autorestart=true
stderr_logfile=/var/log/rq_worker.err.log
stdout_logfile=/var/log/rq_worker.out.log
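
After saving both files, reload the process managers and start everything (this assumes Supervisor is installed for the worker entry):

sudo systemctl daemon-reload
sudo systemctl enable --now myproject
sudo supervisorctl reread
sudo supervisorctl update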
Answer 2:

DigitalOcean's App Platform allows you to designate workers for your application as components. Below are the steps that have proven effective:

  • Install doctl on your machine so you can manage the platform from the command line.
  • List your applications with doctl and grab your application's ID:
doctl apps list
  • Fetch your application's spec with the following command (it prints to stdout, so redirect it to a file):
doctl apps spec get <app-id> > appspec.yaml
  • Edit appspec.yaml to add your worker under the top-level workers: section, as illustrated below (see the note on the Redis URL after these steps):
workers:
- name: rq-worker
  environment_slug: python
  github:
    repo: <owner/repo of the code repository>
    branch: <repository branch>
    deploy_on_push: true
  run_command: rq worker <additional arguments>
  • When you're done editing the file, push the updated spec back to your application:
doctl apps update <your-app-id> --spec appspec.yaml
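
One assumption worth spelling out (not part of the original steps): the worker component needs the same REDIS_URL environment variable as the web service, and the rq CLI can consume it directly through its --url flag, so the run command could look like:

run_command: rq worker --url ${REDIS_URL} high default low

The queue names here are the ones from the question's listen list.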

Following these steps will get your worker running. Make sure your Redis configuration is set up correctly as well.
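
On that last point, one detail worth double-checking (my note, not from the original answer): DigitalOcean's managed Redis is TLS-only, so the connection string uses the rediss:// scheme, and redis-py enables SSL automatically when the connection is built from such a URL:

import os
import redis

# rediss:// (note the double s) tells redis-py to wrap the connection in TLS,
# which DigitalOcean's managed Redis requires
conn = redis.Redis.from_url(os.getenv('REDIS_URL'))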

For further reading, see DigitalOcean's App Platform documentation on worker components.