
Here I am trying to import a CSV file of 800k+ records using Laravel Excel. I tried chunking with different values, but even after several minutes of processing, an error message pops up showing PostTooLargeException. Please guide me if anything's wrong.

Here is my code:

use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToCollection;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class UsersImport implements ToCollection, WithHeadingRow, WithChunkReading, ShouldQueue
{
    /**
     * @param Collection $rows
     */
    public function collection(Collection $rows)
    {
        foreach ($rows as $row) {
            // some code here...
        }
    }

    public function chunkSize(): int
    {
        return 1500;
    }
}

Controller:

public function index(Request $request)
{
    $file = $request->file('file')->store('imports');

    Excel::import(new UsersImport, $file);
}

For now I am just trying to store chunks of data in the jobs table; after that I will store them in another table. The code works perfectly with smaller CSV files but gets stuck on the large file I actually want to work with.

2 Answers


  1. 800K records can be a large amount of data (depending on the number of columns and their contents).

    The error reported by Laravel clearly states that it throws a PostTooLargeException, so the POST size has been exceeded (the default value of post_max_size on a typical system is just 8 MB). For details, you may refer to the official documentation here:

    https://www.php.net/manual/en/ini.core.php

    You may amend post_max_size in your php.ini. Since you are uploading a file, you should also amend upload_max_filesize, and possibly memory_limit.

    The following can be a reference (please use values which suit your needs):

    post_max_size = 1G
    upload_max_filesize = 1G
    memory_limit = 2G 
    

    Make sure you restart httpd after amending the php.ini config file.
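
    If you are unsure which values are actually in effect (the web server and the CLI often load different php.ini files), a quick runtime check with plain PHP's ini_get can confirm them. A minimal sketch, to be served through the web server rather than run on the CLI:

    <?php
    // Serve this through the web SAPI, not the CLI, since each SAPI
    // may load a different php.ini with different limits.
    echo 'post_max_size: ' . ini_get('post_max_size') . PHP_EOL;
    echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . PHP_EOL;
    echo 'memory_limit: ' . ini_get('memory_limit') . PHP_EOL;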

  2. The "PostTooLargeException" error message indicates that the file you are trying to upload exceeds the maximum file size allowed by the server. This is a common issue when uploading large files and can usually be resolved by increasing the maximum file size allowed by the server.

    These limits are controlled by PHP rather than Laravel itself: you can increase the upload_max_filesize and post_max_size values in your php.ini (or, in some setups, an .htaccess file) to allow larger files to be uploaded.
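
    For example, on Apache with mod_php the same overrides can be placed in an .htaccess file (note that this has no effect under PHP-FPM, where php.ini or the pool configuration must be edited instead):

    php_value upload_max_filesize 1G
    php_value post_max_size 1G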

    However, even if you raise the maximum file size, processing large files can still take a lot of time and resources, and can lead to issues such as timeouts and memory exhaustion. One way around these problems is chunking, which your code already uses.

    The "chunkSize()" method of the "UsersImport" class returns the number of rows to process per chunk. They set this value to 1500 which is reasonable. However, if you continue to have problems, try reducing the chunk size even further for more efficient processing.

    Another way to handle large files is to use a queueing system, such as Laravel's built-in queue or a third-party broker like RabbitMQ. A queueing system lets you split file processing into smaller jobs that are processed asynchronously, avoiding timeouts and resource exhaustion.
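
    As a sketch of that approach: since your import already implements ShouldQueue and WithChunkReading, Laravel Excel turns each chunk into its own queued job. You can make the dispatch explicit in the controller with queueImport (this assumes a queue connection such as database or redis is configured, and a worker is running via php artisan queue:work):

    use App\Imports\UsersImport;
    use Maatwebsite\Excel\Facades\Excel;

    public function index(Request $request)
    {
        // Persist the upload first so the queued jobs can re-read it from disk.
        $path = $request->file('file')->store('imports');

        // Queue the import instead of running it inline; each 1500-row chunk
        // is processed as a separate job, so the web request returns quickly
        // and memory usage stays bounded.
        Excel::queueImport(new UsersImport, $path);

        return back()->with('status', 'Import has been queued.');
    }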

    In summary, to import large CSV files into Laravel, you should consider the following steps:

    1. Increase the maximum request size allowed by the server in your php.ini or .htaccess file.
    2. Use chunking to divide the file processing into smaller pieces.
    3. If the problem persists, consider further reducing the chunk size.
    4. Use a queueing system to process files asynchronously and avoid timeout and resource exhaustion issues.