How do I insert a million records into the database without hanging the system, and without the browser reporting that the page is not responding?
```php
$rangeNumber = collect(range($validatedDate['from'], $validatedDate['to']));
$collectData = [];

foreach ($rangeNumber as $number) {
    $collectData[] = [
        'receive_uuid' => $receiveUUID,
        'region_id'    => $regionId,
        'office_id'    => $officeId,
        'number'       => $prefix.str_pad($number, 7, '0', STR_PAD_LEFT),
        'description'  => '',
        'created_at'   => Carbon::now(),
    ];
}

$dataCollection = collect($collectData)->chunk(10000);

foreach ($dataCollection as $key => $data) {
    DB::table('blanks')->insert($data->toArray());
}

session()->flash('MsgSuccess', 'Success');

return $this->redirect(route('blanks'), true);
```
3 Answers
You could use Laravel jobs to handle all of this in the background, so that you don't have to wait for the records to be inserted. Simple example:
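A job class can be generated with artisan (the class name `ProcessRecords` is this answer's example):

```shell
php artisan make:job ProcessRecords
```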
The ProcessRecords class should look something like this:
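A sketch of such a job, assuming the prepared rows are passed in via the constructor (the property name, chunk size, and table name are illustrative, following the question's code):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class ProcessRecords implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /** @param array<int, array<string, mixed>> $records Rows ready for insertion. */
    public function __construct(private array $records)
    {
    }

    public function handle(): void
    {
        // Insert in chunks so a single INSERT statement never grows too large.
        foreach (array_chunk($this->records, 1000) as $chunk) {
            DB::table('blanks')->insert($chunk);
        }
    }
}
```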
Then, you can just dispatch this job from your controller:
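For example, after building `$collectData` exactly as in the question; note the flash message now only confirms that the import was queued, since the inserts happen later on the worker:

```php
// In the controller action, after building $collectData:
ProcessRecords::dispatch($collectData);

session()->flash('MsgSuccess', 'Import queued; records will appear shortly.');

return $this->redirect(route('blanks'), true);
```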
This likely isn't work that should be done synchronously in the request. Ideally, you'd want to run these types of tasks in the background using Jobs (https://laravel.com/docs/10.x/queues#creating-jobs).
The job will then be processed by a queue worker (which you can supervise with Horizon, https://laravel.com/docs/10.x/horizon) in the background, consuming less memory per request and not making users wait for the inserts to finish.
We can create one with:
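For example (the job name `InsertBlankChunk` is just an illustration):

```shell
php artisan make:job InsertBlankChunk
```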
This produces a template that can do the processing in the background:
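Filled in to insert a single chunk, the job might look like this (a sketch; the table name follows the question's code):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class InsertBlankChunk implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /** @param array<int, array<string, mixed>> $chunk One chunk of rows. */
    public function __construct(private array $chunk)
    {
    }

    public function handle(): void
    {
        DB::table('blanks')->insert($this->chunk);
    }
}
```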
I would recommend one job PER chunk, allowing them to be processed in parallel and saving memory in the process. Ideally, to abstract this further, we would create a parent job that performs the data transformation shown in your snippet and spawns a child job for each generated chunk.
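One way to sketch that parent/child layout (all names hypothetical): the parent builds the rows as in the question and dispatches one per-chunk child job, `InsertBlankChunk`, whose `handle()` would simply run `DB::table('blanks')->insert($chunk)`.

```php
<?php

namespace App\Jobs;

use Carbon\Carbon;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class GenerateBlanks implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(
        private int $from,
        private int $to,
        private string $receiveUUID,
        private int $regionId,
        private int $officeId,
        private string $prefix,
    ) {
    }

    public function handle(): void
    {
        $rows = [];

        for ($number = $this->from; $number <= $this->to; $number++) {
            $rows[] = [
                'receive_uuid' => $this->receiveUUID,
                'region_id'    => $this->regionId,
                'office_id'    => $this->officeId,
                'number'       => $this->prefix.str_pad($number, 7, '0', STR_PAD_LEFT),
                'description'  => '',
                'created_at'   => Carbon::now(),
            ];

            // Hand off every 10,000 rows to a child job and free the memory.
            if (count($rows) === 10000) {
                InsertBlankChunk::dispatch($rows);
                $rows = [];
            }
        }

        if ($rows !== []) {
            InsertBlankChunk::dispatch($rows);
        }
    }
}
```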
You can learn more about dispatching jobs here (https://laravel.com/docs/10.x/queues#dispatching-jobs)
Besides queue jobs, what I would suggest is a PHP generator; Laravel already ships one as LazyCollection:
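A sketch using `LazyCollection`: the rows are produced by a generator on demand, so the full million never sit in memory at once (variable, table, and column names follow the question's code):

```php
use Carbon\Carbon;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\LazyCollection;

LazyCollection::make(function () use ($validatedDate, $receiveUUID, $regionId, $officeId, $prefix) {
    // A generator: each row is yielded one at a time, on demand.
    for ($number = $validatedDate['from']; $number <= $validatedDate['to']; $number++) {
        yield [
            'receive_uuid' => $receiveUUID,
            'region_id'    => $regionId,
            'office_id'    => $officeId,
            'number'       => $prefix.str_pad($number, 7, '0', STR_PAD_LEFT),
            'description'  => '',
            'created_at'   => Carbon::now(),
        ];
    }
})
    ->chunk(10000)
    ->each(function ($chunk) {
        DB::table('blanks')->insert($chunk->values()->all());
    });
```

Note that only one 10,000-row chunk is materialized at a time, instead of building the whole `$collectData` array up front.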
TIP:
I would also recommend using Laravel's Benchmark helper:
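For instance, with `Illuminate\Support\Benchmark` (available since Laravel 9.32; `$chunk` here stands for one chunk of rows from the code above):

```php
use Illuminate\Support\Benchmark;
use Illuminate\Support\Facades\DB;

// Returns the duration of the callback in milliseconds.
$ms = Benchmark::measure(fn () => DB::table('blanks')->insert($chunk));

// Or dump the timing and stop execution:
Benchmark::dd(fn () => DB::table('blanks')->insert($chunk));
```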
or the old way:
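The "old way" is plain `microtime()` before and after the work (the loop below is just a stand-in workload; in the real code it would be the insert loop):

```php
<?php
// Record the start time in seconds with microsecond precision.
$start = microtime(true);

// Stand-in workload to time.
$sum = 0;
for ($i = 0; $i < 1000000; $i++) {
    $sum += $i;
}

// Elapsed time in milliseconds.
$elapsedMs = (microtime(true) - $start) * 1000;
echo 'Took '.round($elapsedMs, 2)." ms\n";
```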