Running laravel queue:work on a shared hosting


I want my Laravel queue:work command to keep running on shared hosting (I'm not on a VPS). I can't install anything, and almost every source I found while researching this suggests installing Supervisor. So I thought I could create a cron job that checks whether the queue worker is running and starts it if it isn't. Any help on how to go about this? I'm kind of stuck. Thanks.

P.S. I'm on a Linux server and Laravel 5.3.

There are 9 solutions below.

Answer (score: 10)

Since you mentioned you are on shared hosting, follow the steps below.

Step 1: Set your queue driver to database.

Step 2: Set up a cron job with the following command: php /path/to/application/artisan queue:work --queue=high,default

Give it a try; I hope it works.
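The steps above might look like this in practice. This is only a sketch: the project path and schedule are placeholders, not values from the answer, and on Laravel 5.x the driver is read from the QUEUE_DRIVER variable in .env (the database driver also needs the jobs table, created with php artisan queue:table && php artisan migrate):

```shell
# .env — step 1: use the database queue driver
QUEUE_DRIVER=database

# crontab — step 2: start a worker every minute (hypothetical path)
* * * * * php /path/to/application/artisan queue:work --queue=high,default
```

Note that a plain queue:work process never exits on its own, so a cron entry like this can pile up overlapping workers; the other answers below address that with --once, --stop-when-empty, or the scheduler's withoutOverlapping().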

Answer (score: 0)

I figured out a hack to accomplish this.

In the file Illuminate\Queue\Worker.php (in my Laravel version, 5.3, this is around line 151), in the function runNextJob($connectionName, $queue, WorkerOptions $options), add an else branch as below:

if ($job) {
    return $this->process(
        $connectionName, $job, $options
    );
} else {
    $this->stop();
}

Now create a cron job that runs the command php artisan queue:work as often as you like. The moment the queue is exhausted, the worker will exit (the schedule should be frequent, since the process exits).

UPDATE: Using the task scheduler with withoutOverlapping() prevents further calls of the command if it is already running, so this is a better option, considering the previous one is a change you have to redo every time you run composer install or composer update.
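A minimal sketch of that scheduler approach, assuming the schedule() method in app\Console\Kernel.php (on Laravel 5.3 there is no --stop-when-empty flag, so this still relies on the Worker.php change above to make the worker exit when the queue is empty):

```php
// app\Console\Kernel.php
protected function schedule(Schedule $schedule)
{
    // Start a worker every minute; withoutOverlapping() skips the run
    // if the previous worker process is still alive.
    $schedule->command('queue:work')
        ->everyMinute()
        ->withoutOverlapping();
}
```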

Answer (score: 1)

Try adding two cron jobs:

  1. To clear the cache:
/usr/local/bin/php /home/<path-to-project>/artisan cache:clear
  2. To run the scheduler:
/usr/local/bin/php /home/<path-to-project>/artisan schedule:run

Also, make sure that your app\Console\Kernel.php contains something like:

protected function schedule(Schedule $schedule)
    {
        $schedule->command('queue:work --tries=3')
            ->cron('* * * * *')
            ->withoutOverlapping(5);
    }

Remove the first job (the one that clears the cache) if the queue starts working.

Answer (score: 2)

The best way is to set the following command as a task in your hosting panel (I'm using the Plesk control panel, which lets me set up tasks there):

php artisan queue:work --once

Note: on my shared host, I had to set the following values because of the server configuration:

  1. PHP binary: /opt/plesk/php/7.2/bin/php -q
  2. artisan path: /var/www/vhosts/t4f.ir/httpdocs/artisan
  3. the command itself

So the result would look like this:

/opt/plesk/php/7.2/bin/php -q /var/www/vhosts/t4f.ir/httpdocs/artisan queue:work --once

There is another option for the run schedule, which I set to the Cron type with the value * * * * *, meaning the command is executed every minute. Since I used --once at the end of my command, the worker processes a single job and then terminates. As for concurrent execution, I'm not worried about it, because that is handled by the queueing system and is its responsibility.

Answer (score: 0)

// step 1: To run the scheduler

/usr/local/bin/php /home/USERNAME/public_html/PROJECT_FOLDER/artisan schedule:run >> /dev/null 2>&1

// step 2 in kernel.php

protected function schedule(Schedule $schedule)
{
    $schedule->command('queue:work', [
        '--max-time' => 300,
    ])->withoutOverlapping();
}

// step 3 test it. ;)

Answer (score: 0)

With Laravel 8 you have the --max-time option for queue:work, which stops job processing after a given amount of time.

php artisan queue:work --max-time=300

You can set up a cron job, for example, every 5 minutes to run job processing and set the --max-time option to 5 minutes (300 seconds). In that case, cron will start job processing, after 300 seconds queue:work will exit, and, a couple of seconds later, cron will start job processing again, and so on.
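Putting that together, the cron entry might look like this (the PHP binary and project path are placeholders, not values from the answer):

```shell
# Every 5 minutes, start a worker that exits after at most 300 seconds
*/5 * * * * /usr/local/bin/php /path/to/project/artisan queue:work --max-time=300
```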

Answer (score: 0)

This is the solution that worked for me after searching for days.

flock -n /tmp/laravel_queues.lockfile /usr/bin/php /path/to/laravel/artisan queue:listen

See https://laracasts.com/discuss/channels/servers/database-queue-on-shared-hosting
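For context, flock -n acquires the lock non-blockingly and exits immediately if another run still holds it, so the command can go straight into a cron entry. The schedule below is an example, not from the answer, and the lockfile path is arbitrary:

```shell
# Every minute; flock -n ensures at most one queue:listen process runs
* * * * * flock -n /tmp/laravel_queues.lockfile /usr/bin/php /path/to/laravel/artisan queue:listen
```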

Answer (score: 1)

One more solution (I solved the same problem this way). You can make a script like this:

#!/bin/sh
# get running processes whose command line mentions "queue:work"
# (this also matches the grep itself, but that line lacks "artisan",
# so the check below still works)
QUEUE_STATUS=$(ps aux | grep "queue:work")

# if queue:work is already running, exit; otherwise start it
if echo "$QUEUE_STATUS" | grep --quiet 'artisan queue:work'
then
    exit
else
    php ~/public_html/artisan queue:work
fi

and run it via cron. I run it every 10 minutes.
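The cron entry for such a script might look like this (the script path and name are hypothetical; the 10-minute schedule is the one the answer uses):

```shell
# Every 10 minutes, start a worker if none is running
*/10 * * * * /bin/sh /home/username/check_queue.sh
```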

Answer (score: 1)

Laravel >= 5.7

Adding this line to the scheduler in app\Console\Kernel.php, and then setting up a cron job to run the scheduler, ensures that the queue gets serviced every minute.

$schedule->command('queue:work --stop-when-empty')
    ->everyMinute()
    ->withoutOverlapping();

For cPanel servers, the cron statement might look like:

* * * * * /usr/local/bin/php /home/{account_name}/live/artisan schedule:run

where {account_name} is the user account that cPanel is running under and live is the folder of the Laravel application.

Source: https://talltips.novate.co.uk/laravel/using-queues-on-shared-hosting-with-laravel

Laravel <= 5.6

--once: force the worker to exit after running one job

--tries: the maximum number of times a job may be attempted

--timeout: the maximum number of seconds a job may run

withoutOverlapping(): by default, scheduled tasks run even if the previous instance of the task is still running; to prevent this, use the withoutOverlapping method

sendOutputTo(): send the output to a file for later inspection

$schedule->command('queue:work --timeout=60 --tries=1 --once')
    ->everyMinute()
    ->withoutOverlapping()
    ->sendOutputTo(storage_path() . '/logs/queue-jobs.log');

Laravel All version

If you don't want to use the --once or --max-time options, it's better to schedule the restart command, to avoid memory leaks and to pick up code changes.

$schedule->command('queue:restart')
    ->everyFiveMinutes();

Laravel <= 5.2

Not recommended:

The --daemon option keeps the worker running continuously; you can start it once from the command line if you have SSH or cPanel terminal access.
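On those versions the command would be started like this (only workable if the host lets the long-running process survive, which is why it is not recommended on shared hosting):

```shell
php artisan queue:work --daemon
```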