
I am trying to upload a CSV file to my database in Laravel, but my CSV file is pretty big: I have almost 500 million rows that I want to import. (I am using Maatwebsite to do this.)

And when I try to import it I am getting:

Maximum execution time of 300 seconds exceeded

As you can see, I already changed "max_input_time" in the php.ini file. 300 seconds should be enough, because DataGrip takes only 3 minutes. And even if it takes longer in Laravel, there has to be another way than increasing "max_input_time".

This is the code that converts the data into a model and eventually puts it into the database:

public function model(array $row)
{
    // Map one CSV row to a DutchPostalcode model
    return new DutchPostalcode([
        'postalcode' => $row['PostcodeID'],
        'street' => $row['Straat'],
        'place' => $row['Plaats'],
        'government' => $row['Gemeente'],
        'province' => $row['Provincie'],
        'latitude' => $row['Latitude'],
        'longtitude' => $row['Longitude'],
    ]);
}

This is my controller:

public function writeDutchPostalCodes()
{
    Excel::import(new DutchPostalcodes, 'C:UsersMoemeDocumentsProjectsahmo appsAppsfreshnessFreshness - beFreshnessBEresourcespostalcodespostcodetabel_1.csv');
}

2 Answers


  1. Use Laravel queues.

    https://laravel.com/docs/9.x/queues

    For large processes like this you should do the work in the background, e.g. as sketched below.
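    The following is only a minimal sketch, assuming the DutchPostalcodes import class from the question (the App\Models namespace and the chunk size of 1000 are assumptions, and any heading-row configuration the existing class already has would stay as it is). Implementing ShouldQueue together with WithChunkReading makes Maatwebsite dispatch each chunk of rows as its own queued job, so no single job has to finish within the 300-second limit:

    use App\Models\DutchPostalcode;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Maatwebsite\Excel\Concerns\ToModel;
    use Maatwebsite\Excel\Concerns\WithChunkReading;

    class DutchPostalcodes implements ToModel, WithChunkReading, ShouldQueue
    {
        public function model(array $row)
        {
            // Same row-to-model mapping as in the question
            return new DutchPostalcode([
                'postalcode' => $row['PostcodeID'],
                'street' => $row['Straat'],
                'place' => $row['Plaats'],
                'government' => $row['Gemeente'],
                'province' => $row['Provincie'],
                'latitude' => $row['Latitude'],
                'longtitude' => $row['Longitude'],
            ]);
        }

        // Each chunk of 1000 rows becomes its own queued job
        public function chunkSize(): int
        {
            return 1000;
        }
    }

    The controller call stays the same; the import is queued automatically because the class implements ShouldQueue. You do need a queue worker running (php artisan queue:work) and a queue driver other than sync.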
  2. Increase the max_execution_time in php.ini,
    or split the file for processing, similar to array_chunk; see the sketch after this answer.
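    For the first option, a minimal sketch of the controller from the question ($path is a stand-in for the CSV path used there): the limit can also be raised for just this request with set_time_limit(), instead of editing php.ini globally.

    public function writeDutchPostalCodes()
    {
        // 0 removes the execution time limit for this request only;
        // a concrete number of seconds works as well
        set_time_limit(0);

        // $path: the same CSV path as in the question
        Excel::import(new DutchPostalcodes, $path);
    }

    The second option, splitting the file, is essentially what WithChunkReading in the sketch above does; adding Maatwebsite's WithBatchInserts concern (a batchSize() method) on top groups the resulting models into batched insert queries instead of one query per row, which usually matters more than the time limit itself for an import of this size.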