I have jobs that run on multiple queue workers and contain some HTTP requests made with Guzzle. However, the try-catch block inside these jobs does not seem to pick up GuzzleHttp\Exception\RequestException when the jobs run in a background process. The running process is php artisan queue:work, the Laravel queue worker that monitors the queue and picks up the jobs.
Instead, the exception that is thrown is a GuzzleHttp\Promise\RejectionException with the message:

The promise was rejected with reason: cURL error 28: Operation timed out after 30001 milliseconds with 0 bytes received (see https://curl.haxx.se/libcurl/c/libcurl-errors.html)
This is actually a disguised GuzzleHttp\Exception\ConnectException (see https://github.com/guzzle/promises/blob/master/src/RejectionException.php#L22), because if I run a similar job in a regular PHP process triggered by visiting a URL, I do get the ConnectException as intended, with the message:

cURL error 28: Operation timed out after 100 milliseconds with 0 out of 0 bytes received (see https://curl.haxx.se/libcurl/c/libcurl-errors.html)
Sample code that would trigger this timeout:
try {
    $c = new \GuzzleHttp\Client([
        'timeout' => 0.1
    ]);
    $response = (string) $c->get('https://example.com')->getBody();
} catch (\GuzzleHttp\Exception\RequestException $e) {
    // This occasionally gets caught when a ConnectException (a child class) is thrown,
    // but it doesn't happen with RejectionException, because that is not a child
    // of RequestException.
}
The code above throws either a RejectionException or a ConnectException when run in the worker process, but always a ConnectException when tested manually through the browser (as far as I can tell).
So basically what I derive is that this RejectionException is wrapping the message from the ConnectException. However, I am not using the asynchronous features of Guzzle; my requests are simply done in series. The only things that differ are that multiple PHP processes might be making Guzzle HTTP calls, or that the jobs themselves are timing out (which should result in a different exception, namely Laravel's Illuminate\Queue\MaxAttemptsExceededException), but I don't see how that would cause the code to behave differently.
I couldn't find any code inside the Guzzle packages that uses php_sapi_name()/PHP_SAPI (which determines the interface in use) to do something different when running from the CLI as opposed to a browser-triggered request.
tl;dr
Why does Guzzle throw RejectionExceptions in my worker processes, but ConnectExceptions in regular PHP scripts triggered through the browser?
Edit 1
Sadly, I cannot create a minimal reproducible example. I see many error messages in my Sentry issue tracker with the exact exception shown above. The source is stated as Starting Artisan command: horizon:work (which is Laravel Horizon; it supervises the Laravel queues). I've checked again to see if there's a discrepancy between PHP versions, but both the website and the worker processes run the same PHP 7.3.14, which is correct:
PHP 7.3.14-1+ubuntu18.04.1+deb.sury.org+1 (cli) (built: Jan 23 2020 13:59:16) ( NTS )
Copyright (c) 1997-2018 The PHP Group
Zend Engine v3.3.14, Copyright (c) 1998-2018 Zend Technologies
with Zend OPcache v7.3.14-1+ubuntu18.04.1+deb.sury.org+1, Copyright (c) 1999-2018, by Zend Technologies
- The cURL version is cURL 7.58.0.
- The Guzzle version is guzzlehttp/guzzle 6.5.2.
- The Laravel version is laravel/framework 6.12.0.
Edit 2 (stack trace)
GuzzleHttp\Promise\RejectionException: The promise was rejected with reason: cURL error 28: Operation timed out after 30000 milliseconds with 0 bytes received (see https://curl.haxx.se/libcurl/c/libcurl-errors.html)
#44 /vendor/guzzlehttp/promises/src/functions.php(112): GuzzleHttp\Promise\exception_for
#43 /vendor/guzzlehttp/promises/src/Promise.php(75): GuzzleHttp\Promise\Promise::wait
#42 /vendor/guzzlehttp/guzzle/src/Client.php(183): GuzzleHttp\Client::request
#41 /app/Bumpers/Client.php(333): App\Bumpers\Client::callRequest
#40 /app/Bumpers/Client.php(291): App\Bumpers\Client::callFunction
#39 /app/Bumpers/Client.php(232): App\Bumpers\Client::bumpThread
#38 /app/Models/Bumper.php(206): App\Models\Bumper::post
#37 /app/Jobs/PostBumper.php(59): App\Jobs\PostBumper::handle
#36 [internal](0): call_user_func_array
#35 /vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(32): Illuminate\Container\BoundMethod::Illuminate\Container\{closure}
#34 /vendor/laravel/framework/src/Illuminate/Container/Util.php(36): Illuminate\Container\Util::unwrapIfClosure
#33 /vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(90): Illuminate\Container\BoundMethod::callBoundMethod
#32 /vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(34): Illuminate\Container\BoundMethod::call
#31 /vendor/laravel/framework/src/Illuminate/Container/Container.php(590): Illuminate\Container\Container::call
#30 /vendor/laravel/framework/src/Illuminate/Bus/Dispatcher.php(94): Illuminate\Bus\Dispatcher::Illuminate\Bus\{closure}
#29 /vendor/laravel/framework/src/Illuminate/Pipeline/Pipeline.php(130): Illuminate\Pipeline\Pipeline::Illuminate\Pipeline\{closure}
#28 /vendor/laravel/framework/src/Illuminate/Pipeline/Pipeline.php(105): Illuminate\Pipeline\Pipeline::then
#27 /vendor/laravel/framework/src/Illuminate/Bus/Dispatcher.php(98): Illuminate\Bus\Dispatcher::dispatchNow
#26 /vendor/laravel/framework/src/Illuminate/Queue/CallQueuedHandler.php(83): Illuminate\Queue\CallQueuedHandler::Illuminate\Queue\{closure}
#25 /vendor/laravel/framework/src/Illuminate/Pipeline/Pipeline.php(130): Illuminate\Pipeline\Pipeline::Illuminate\Pipeline\{closure}
#24 /vendor/laravel/framework/src/Illuminate/Pipeline/Pipeline.php(105): Illuminate\Pipeline\Pipeline::then
#23 /vendor/laravel/framework/src/Illuminate/Queue/CallQueuedHandler.php(85): Illuminate\Queue\CallQueuedHandler::dispatchThroughMiddleware
#22 /vendor/laravel/framework/src/Illuminate/Queue/CallQueuedHandler.php(59): Illuminate\Queue\CallQueuedHandler::call
#21 /vendor/laravel/framework/src/Illuminate/Queue/Jobs/Job.php(88): Illuminate\Queue\Jobs\Job::fire
#20 /vendor/laravel/framework/src/Illuminate/Queue/Worker.php(354): Illuminate\Queue\Worker::process
#19 /vendor/laravel/framework/src/Illuminate/Queue/Worker.php(300): Illuminate\Queue\Worker::runJob
#18 /vendor/laravel/framework/src/Illuminate/Queue/Worker.php(134): Illuminate\Queue\Worker::daemon
#17 /vendor/laravel/framework/src/Illuminate/Queue/Console/WorkCommand.php(112): Illuminate\Queue\Console\WorkCommand::runWorker
#16 /vendor/laravel/framework/src/Illuminate/Queue/Console/WorkCommand.php(96): Illuminate\Queue\Console\WorkCommand::handle
#15 /vendor/laravel/horizon/src/Console/WorkCommand.php(46): Laravel\Horizon\Console\WorkCommand::handle
#14 [internal](0): call_user_func_array
#13 /vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(32): Illuminate\Container\BoundMethod::Illuminate\Container\{closure}
#12 /vendor/laravel/framework/src/Illuminate/Container/Util.php(36): Illuminate\Container\Util::unwrapIfClosure
#11 /vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(90): Illuminate\Container\BoundMethod::callBoundMethod
#10 /vendor/laravel/framework/src/Illuminate/Container/BoundMethod.php(34): Illuminate\Container\BoundMethod::call
#9 /vendor/laravel/framework/src/Illuminate/Container/Container.php(590): Illuminate\Container\Container::call
#8 /vendor/laravel/framework/src/Illuminate/Console/Command.php(201): Illuminate\Console\Command::execute
#7 /vendor/symfony/console/Command/Command.php(255): Symfony\Component\Console\Command\Command::run
#6 /vendor/laravel/framework/src/Illuminate/Console/Command.php(188): Illuminate\Console\Command::run
#5 /vendor/symfony/console/Application.php(1012): Symfony\Component\Console\Application::doRunCommand
#4 /vendor/symfony/console/Application.php(272): Symfony\Component\Console\Application::doRun
#3 /vendor/symfony/console/Application.php(148): Symfony\Component\Console\Application::run
#2 /vendor/laravel/framework/src/Illuminate/Console/Application.php(93): Illuminate\Console\Application::run
#1 /vendor/laravel/framework/src/Illuminate/Foundation/Console/Kernel.php(131): Illuminate\Foundation\Console\Kernel::handle
#0 /artisan(37): null
The Client::callRequest() function simply contains a Guzzle Client on which I call $client->request($request['method'], $request['url'], $request['options']); (so I'm not using requestAsync()). I think it has something to do with running jobs in parallel that causes this issue.
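For reference, here is a minimal sketch of what callRequest() boils down to, based purely on the description above; the 30-second timeout is an assumption derived from the "30000 milliseconds" in the stack trace, and the rest of the real method is not shown:

protected function callRequest(array $request)
{
    // Plain synchronous Guzzle client; the timeout value is assumed, not taken
    // from the actual application code.
    $client = new \GuzzleHttp\Client(['timeout' => 30]);

    // Synchronous request() call; a connection timeout here would be expected
    // to surface as a GuzzleHttp\Exception\ConnectException.
    return $client->request($request['method'], $request['url'], $request['options']);
}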
Edit 3 (solution found)
Consider the following test case, which makes an HTTP request (one that should return a regular 200 response):
try {
    $c = new \GuzzleHttp\Client([
        'base_uri' => 'https://example.com'
    ]);

    $handler = $c->getConfig('handler');
    $handler->push(\GuzzleHttp\Middleware::mapResponse(function (\Psr\Http\Message\ResponseInterface $response) {
        // Create a fake connection exception:
        $e = new \GuzzleHttp\Exception\ConnectException('abc', new \GuzzleHttp\Psr7\Request('GET', 'https://example.com/2'));

        // These 2 lines both cascade as a ConnectException:
        throw $e;
        return \GuzzleHttp\Promise\rejection_for($e);

        // This line cascades as a RejectionException:
        return \GuzzleHttp\Promise\rejection_for($e->getMessage());
    }));

    $c->get('');
} catch (\Exception $e) {
    var_dump($e);
}
Now, what I originally did was call rejection_for($e->getMessage()), which creates its own RejectionException based on the message string. Calling rejection_for($e) was the correct solution here. The only thing left to answer is whether this rejection_for function is the same as a simple throw $e.
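To answer that last question at least partially, here is a paraphrase (not a verbatim quote) of the relevant helper in guzzlehttp/promises 1.x:

// rejection_for() does not throw anything; it only wraps the reason in a
// RejectedPromise (unless the reason already is a promise):
function rejection_for($reason)
{
    if ($reason instanceof \GuzzleHttp\Promise\PromiseInterface) {
        return $reason;
    }

    return new \GuzzleHttp\Promise\RejectedPromise($reason);
}

So rejection_for($e) and throw $e are not the same thing: a throw propagates immediately, while rejection_for() merely marks the promise as rejected with the given reason. When the synchronous request later calls wait(), that reason is converted into an exception: an Exception instance (such as the ConnectException) is rethrown as-is, whereas a plain string ends up wrapped in a RejectionException, which is exactly the difference between rejection_for($e) and rejection_for($e->getMessage()).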
5 Answers
Hello, I would like to know whether you are getting a 4xx or a 5xx error.
Even so, I will list some alternative solutions I found for problems that resemble yours.
alternative 1
I'd like to bump this: I had this issue with a new production server returning unexpected 400 responses, while the development and test environments worked as expected; simply running apt install php7.0-curl fixed it.
It was a brand new Ubuntu 16.04 LTS install with PHP installed via ppa:ondrej/php. During debugging I noticed that the headers were different. Both were sending a multi-part form with chunked data; however, without php7.0-curl it was sending a Connection: close header rather than Expect: 100-Continue, and both requests had Transfer-Encoding: chunked.
alternative 2
Maybe you should try this: Guzzle's exceptions need to be caught when the response code is not 200, or HTTP errors need to be disabled via the request options, as sketched below.
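A minimal sketch of both approaches, assuming a plain Guzzle client (the URL is only an example):

use GuzzleHttp\Client;
use GuzzleHttp\Exception\BadResponseException;

$client = new Client();

// Option 1: keep exceptions enabled and catch 4xx/5xx responses explicitly.
try {
    $response = $client->get('https://example.com/api');
} catch (BadResponseException $e) {
    // 4xx and 5xx responses end up here.
}

// Option 2: disable HTTP error exceptions and inspect the status code yourself.
$response = $client->get('https://example.com/api', ['http_errors' => false]);
if ($response->getStatusCode() !== 200) {
    // Handle the error response yourself.
}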
alternative 3
In my case it was because I had passed an empty array in the request's $options['json'].
I couldn't reproduce the 500 on the server using Postman or cURL, even when passing the Content-Type: application/json request header.
Anyway, removing the json key from the request's options array solved the problem.
I spent about 30 minutes trying to figure out what was wrong, because this behavior is very inconsistent. For all other requests I'm making, passing $options['json'] = [] didn't cause any issues. It could be a server issue though; I don't control the server.
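A small sketch of that workaround (the $payload and endpoint names are illustrative): only set the json key when there is actually something to send.

$options = [];

// Avoid sending an empty JSON body, which triggered the 500 in this case.
if (!empty($payload)) {
    $options['json'] = $payload;
}

$response = $client->post('/endpoint', $options);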
Send feedback on the details obtained
Hello, I didn't understand whether you ended up solving your problem or not.
I would like you to post the error log. Search both in PHP's log and in your server's error log.
I await your feedback.
As this happens sporadically in your environment and it's hard to replicate throwing the RejectionException (at least I could not), can you just add another catch block to your code, as sketched below? It should give you and us some ideas on why and when this happens.
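The original snippet from this answer was not preserved, so the following is only a sketch of the idea; the logging calls and URL are illustrative:

use GuzzleHttp\Exception\RequestException;
use GuzzleHttp\Promise\RejectionException;
use Illuminate\Support\Facades\Log;

try {
    $response = (string) $client->get('https://example.com')->getBody();
} catch (RequestException $e) {
    // ConnectException and other request-level failures are caught here.
    Log::warning('Guzzle RequestException: ' . $e->getMessage());
} catch (RejectionException $e) {
    // Log the raw rejection reason to see what was actually passed to the promise.
    Log::warning('Guzzle RejectionException, reason: ' . print_r($e->getReason(), true));
}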
Guzzle uses Promises for both synchronous and asynchronous requests. The only difference is that when you use a synchronous request (your case), it is fulfilled right away by calling the wait() method. Note this part:
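(The excerpt originally quoted here was not preserved; below is a paraphrase, from memory, of the relevant behavior in guzzlehttp/promises 1.x, where a rejected promise is unwrapped by wait():)

// Inside GuzzleHttp\Promise\Promise::wait(), a rejected result is "unwrapped"
// by throwing: throw exception_for($this->result);
//
// exception_for() then decides which exception you actually see:
function exception_for($reason)
{
    return $reason instanceof \Exception
        ? $reason                                               // e.g. RequestException / ConnectException
        : new \GuzzleHttp\Promise\RejectionException($reason);  // any other reason, e.g. a string
}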
So, it throws a RequestException, which is an instance of Exception, and this always happens on 4xx and 5xx HTTP errors, unless throwing exceptions is disabled via the options. As you can see, it may also throw a RejectionException if the reason is not an instance of Exception, e.g. if the reason is a string, which seems to be what happens in your case. The weird thing is that you get a RejectionException rather than a RequestException, as Guzzle throws a ConnectException on a connection timeout error. Anyway, you may find the reason if you go through your RejectionException stack trace in Sentry and find where the reject() method is called on the Promise.

Discussion with the author in the comment section, as a starter to my answer:
Question:
Answer of the author:
According to this, here is my thesis: you have a timeout inside one of your middlewares which calls Guzzle. So let's try to implement a reproducible case.
Here we have a custom middleware which calls Guzzle and returns a rejection with the exception message of the sub-call. It's pretty tricky, because due to the internal error handling it becomes invisible inside the stack trace.
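(The original snippet was not preserved here; the following is a hypothetical reconstruction based on the description above, with the inner URL and the very short timeout chosen only to make the sub-call fail reliably:)

use GuzzleHttp\Client;
use GuzzleHttp\Exception\ConnectException;
use GuzzleHttp\HandlerStack;
use GuzzleHttp\Middleware;
use GuzzleHttp\Promise;
use Psr\Http\Message\ResponseInterface;

$stack = HandlerStack::create();

// Response middleware that performs a second, nested Guzzle call. When that
// sub-call fails, it rejects the outer promise with the *message string* of
// the inner exception, which is what later surfaces as a RejectionException.
$stack->push(Middleware::mapResponse(function (ResponseInterface $response) {
    try {
        // Aggressive timeout so the sub-call fails reliably in this test.
        (new Client(['timeout' => 0.001]))->get('https://example.com');
    } catch (ConnectException $e) {
        // Rejecting with a string instead of the exception object itself.
        return Promise\rejection_for($e->getMessage());
    }

    return $response;
}));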
This is a test example of how you can use it:
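(Also reconstructed under the same assumptions; it reuses the $stack from the sketch above:)

use GuzzleHttp\Promise\RejectionException;

$client = new Client([
    'handler'  => $stack,
    'base_uri' => 'https://example.com',
]);

try {
    // The outer call is a plain, synchronous request.
    $client->get('/');
} catch (RejectionException $e) {
    // The failure of the nested sub-call surfaces here, carrying only its message.
    echo get_class($e) . ': ' . $e->getMessage() . PHP_EOL;
}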
As soon as I run a test against this, I receive a RejectionException whose message comes from the failed sub-call. So it looks like your main Guzzle call failed, but in reality it is the sub-call that failed.
Let me know if this helps you identify your specific issue. I would also very much appreciate it if you could share your middlewares, in order to debug this a little further.