
I have been stuck for a while trying to figure out the exception below; any help would be appreciated.
We are using the AWS Lambda service on a Laravel project, and we export a large amount of data to CSV files using Laravel Excel, processed through Laravel SQS queues.

PHP version: 7.2
Laravel Framework: 7.30.1
Laravel Excel: 3.1

Exception:

Aws\Sqs\Exception\SqsException /tmp/vendor/aws/aws-sdk-php...

stage.ERROR: Error executing "SendMessage" on "https://sqs.eu-central-1.amazonaws.com"; AWS HTTP error: Client error: `POST https://sqs.eu-central-1.amazonaws.com/` resulted in a `413 Request Entity Too Large` response:
HTTP content length exceeded 1662976 bytes.
Unable to parse error information from response - Error parsing XML: String could not be parsed as XML {"exception":"[object] (Aws\Sqs\Exception\SqsException(code: 0): Error executing "SendMessage" on "https://sqs.eu-central-1.amazonaws.com/"; AWS HTTP error: Client error: `POST https://sqs.eu-central-1.amazonaws.com/` resulted in a `413 Request Entity Too Large` response:
HTTP content length exceeded 1662976 bytes.
Unable to parse error information from response - Error parsing XML: String could not be parsed as XML at /tmp/vendor/aws/aws-sdk-php/src/WrappedHttpHandler.php:195)
[stacktrace]

3 Answers


  1. Chosen as BEST ANSWER

    Scenario
    According to the AWS docs, the hard limit for a message sent to an SQS queue is 256 KB. The error was happening because I was sending a Collection object to the SQS queue, and each object carried multiple eager-loaded relationships, so the serialized payload became far too large.

    AWS Solution
    Passing only the IDs of the collection's models, as a plain array, to the SQS queue solved the Request Entity Too Large problem; see the sketch below.
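
    A minimal sketch of that idea (ExportOrdersJob, the Order model, and its relations are hypothetical names): only the primary keys travel through SQS, and the models with their relationships are re-queried inside the worker.

        <?php

        namespace App\Jobs;

        use App\Order;                              // hypothetical model (Laravel 7 default namespace)
        use Illuminate\Bus\Queueable;
        use Illuminate\Contracts\Queue\ShouldQueue;
        use Illuminate\Foundation\Bus\Dispatchable;
        use Illuminate\Queue\InteractsWithQueue;

        class ExportOrdersJob implements ShouldQueue {
            use Dispatchable, InteractsWithQueue, Queueable;

            /** @var int[] */
            private $ids;

            public function __construct(array $ids) {
                // Only the IDs end up in the SQS message body,
                // keeping the payload far below the 256 KB limit.
                $this->ids = $ids;
            }

            public function handle() {
                // Re-load the models and their relations inside the worker.
                $orders = Order::with('customer', 'items')->findMany($this->ids);

                // ... build the CSV rows from $orders
            }
        }

        // Dispatch with plucked keys instead of the Collection itself, e.g.
        // ExportOrdersJob::dispatch(Order::where('status', 'done')->pluck('id')->all());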

    Excel solution
    Instead of using Laravel Excel, I now use Laravel's Storage::append method to add rows to my CSV file. I have tested it with 50k rows and it works very well. Do not process all the rows at once, though, or you may hit the AWS timeout (Lambda's maximum execution time, which I believe is 15 minutes); a sketch follows.
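
    Roughly, inside the queued job's handle() method it looks like this (the Order model and its columns are placeholders): write the header once, then append rows chunk by chunk.

        <?php

        use App\Order;                              // hypothetical model
        use Illuminate\Support\Facades\Storage;

        $path = 'exports/orders.csv';

        // Write the header once ...
        Storage::put($path, 'id,customer,total');

        // ... then append rows in chunks so memory stays flat and no single
        // chunk of work grows too large.
        Order::query()->chunkById(1000, function ($orders) use ($path) {
            $rows = $orders->map(function ($order) {
                return implode(',', [$order->id, $order->customer_name, $order->total]);
            })->implode(PHP_EOL);

            Storage::append($path, $rows);
        });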


  2. From the docs:

    Message size
    The minimum message size is 1 byte (1 character). The maximum is 262,144 bytes (256 KB).

    My guess is that the payloads you’re trying to send are larger than 256 KB, which is a hard limit for SQS messages in AWS.

    The output "HTTP content length exceeded 1662976 bytes" suggests my guess might be correct.

    The common pattern in this case is to upload the files to S3 and send a reference to the object through the Queue.
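
    A rough sketch of that pattern (ProcessExport, the s3 disk name, and $csvContents holding the generated CSV are assumptions); only the S3 object key ends up in the SQS message:

        <?php

        use App\Jobs\ProcessExport;                 // hypothetical job
        use Illuminate\Support\Facades\Storage;

        // Store the large payload on S3 ...
        $key = 'exports/' . now()->format('Ymd_His') . '.csv';
        Storage::disk('s3')->put($key, $csvContents);

        // ... and queue only a reference to it (a few bytes instead of megabytes).
        ProcessExport::dispatch($key);

        // Inside ProcessExport::handle(), fetch it back:
        // $contents = Storage::disk('s3')->get($this->key);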

  3. I got this error when running CSV imports on Laravel Vapor with the Laravel Excel library. I was able to fix it without rewriting the file processing.

    The simple solution for me was to change implements ShouldQueue, WithChunkReading to implements ShouldQueueWithoutChain, WithChunkReading on my importable class.

    e.g.

    
        <?php

        namespace App\Imports;

        use Maatwebsite\Excel\Concerns\ShouldQueueWithoutChain;
        use Maatwebsite\Excel\Concerns\WithChunkReading;

        class BaseDataImporter implements ShouldQueueWithoutChain, WithChunkReading {
            // ... other import concerns and methods

            // WithChunkReading also requires a chunk size.
            public function chunkSize(): int {
                return 1000;
            }
        }
    
    

    Here’s what’s going on under the hood: when Laravel Excel sees WithChunkReading, it serializes all of the chunks as a single "chained" job. If your CSV file has enough rows, the length of this chain can exceed 256 KB. Using ShouldQueueWithoutChain changes this logic so that a small, easily queueable job is created for each chunk.

    This error is hard to catch if you are testing locally with the database queue driver, because its jobs.payload column is a LONGTEXT, which can hold something like 4 GB. You can approximate the SQS limitation by adding a migration that changes jobs.payload from LONGTEXT to TEXT. That caps the job size at 64 KB, which is well below the SQS limit, but it is an effective canary for catching chained jobs that have far too many chunks for AWS (assuming you don’t anticipate needing to queue extremely large jobs). A migration sketch follows.
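
    A sketch of such a migration, assuming a MySQL jobs table created by Laravel's default queue table migration:

        <?php

        use Illuminate\Database\Migrations\Migration;
        use Illuminate\Support\Facades\DB;

        class ShrinkJobsPayloadForLocalTesting extends Migration {
            public function up() {
                // TEXT caps payloads at ~64 KB, well below the 256 KB SQS limit,
                // so oversized chained jobs fail locally instead of in production.
                DB::statement('ALTER TABLE jobs MODIFY payload TEXT NOT NULL');
            }

            public function down() {
                DB::statement('ALTER TABLE jobs MODIFY payload LONGTEXT NOT NULL');
            }
        }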
