
I have a Laravel application that sends several emails, and some of these emails have to wait some time before being sent.

So I’m using the database queue driver, and on localhost I run the command php artisan schedule:run, which executes this scheduled task:

$schedule->command('queue:work')->everyMinute();

and it works perfectly.

Now I’ve moved the project to cPanel shared hosting, and to run the schedule command I created a cron job that does this:

/usr/local/bin/php /path to project/artisan schedule:run

Since I need to constantly check whether an email needs to be sent, I set the cron job to run every minute, and it works for the first 5 or 10 minutes.
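For reference, the usual shared-hosting setup is a single crontab entry that invokes the scheduler every minute (the project path below is a placeholder, use your real one):

    * * * * * /usr/local/bin/php /home/user/project/artisan schedule:run >> /dev/null 2>&1

The five asterisks mean "every minute"; redirecting output to /dev/null keeps cron from emailing you on every run.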

Then I start receiving a 503 error from the server because I hit the process limit, probably due to the cron job executions. And now the server will be down for 24 hours.

How can I solve this? What is the best solution?

Thank you

2 Answers


  1. I use shared hosting and had a similar issue. If your hosting service allows the PHP shell_exec() function, you could do this:

        protected function schedule(Schedule $schedule)
        {
            // Only start a worker if one is not already running
            if (!strstr((string) shell_exec('ps xf'), 'php artisan queue:work'))
            {
                $schedule->command('queue:work --timeout=60 --tries=1')->everyMinute();
            }
        }
    

    Your cron job seems OK. By the way, if your hosting server is down for 24 hours, you may want to consider another host, my friend.

    queue:work is a long-running process. This check ensures it’s running on your server. It listens to your queue and processes the jobs. It also means that if you make changes to your production files, the worker will not pick them up. Have a look at my top -ac output:

        PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
    2398733 user  20   0  466m  33m  12m S  0.0  0.1   0:03.15 /opt/alt/php72/usr/bin/php artisan queue:work --timeout=60 --tries=1
    2397359 user  20   0  464m  33m  12m S  0.0  0.1   0:03.04 /usr/local/bin/php /home/user/booklet/artisan schedule:run
    2398732 user  20   0  105m 1308 1136 S  0.0  0.0   0:00.00 sh -c '/opt/alt/php72/usr/bin/php' 'artisan' queue:work --timeout=60 --tries=1 >> '/home/user/booklet/storage/queue.log' 2>&1
    

    As you can see, the worker is at the top; another process simply writes everything it does to a log file. You have to kill process 2398733 after uploading new changes to your production server. The worker will restart by itself in less than 5 minutes, thanks to the schedule:run cron job.
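    Instead of hunting down the PID by hand, Laravel also ships a command that asks all running workers to exit gracefully once they finish their current job; the scheduled check above then brings a fresh worker back up:

        php artisan queue:restart

    Note that queue:restart signals the workers through the cache, so it requires a properly configured cache driver.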

    Update October 2019

        protected function schedule(Schedule $schedule)
        {
            // Only start a worker if one is not already running
            if (!strstr((string) shell_exec('ps xf'), 'php artisan queue:work'))
            {
                $schedule->command('queue:work --timeout=60 --tries=1')->withoutOverlapping();
            }
        }
    

    The ->withoutOverlapping() method prevents a new instance of the command from starting while the previous one is still running. It ensures that the artisan schedule:run command exits properly instead of stacking up workers.

  2. You can prevent this from happening by using withoutOverlapping on the cron task.

    By default, scheduled tasks will be run even if the previous instance of the task is still running. To prevent this, you may use the withoutOverlapping method:

    $schedule->command('emails:send')->withoutOverlapping();
    

    https://laravel.com/docs/5.7/scheduling#preventing-task-overlaps

    This way, your cron will restart the queue:work task if it fails for some reason, but it won’t fire up multiple instances of it.
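    Putting the two answers together, a sketch of the scheduler entry in app/Console/Kernel.php could look like this; runInBackground() is the method that actually detaches the process so schedule:run can exit immediately (the timeout and tries values are just examples):

        protected function schedule(Schedule $schedule)
        {
            // One worker at a time, detached so schedule:run returns right away
            $schedule->command('queue:work --timeout=60 --tries=1')
                     ->everyMinute()
                     ->withoutOverlapping()
                     ->runInBackground();
        }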
