I am trying to import an Excel file into my database (Laravel/MySQL). I am using the PhpSpreadsheet library and reading the file in chunks, because the Excel files can be very large.
The problem is that the chunks are not working properly. I want chunks of 300 rows, but each iteration adds the new chunk on top of the previous one, which gives incorrect chunk results and leaves a lot of duplicates and garbage data in my database.
Here is my code:

$totalRows = 1051;
$currentRows = 1;
while ($currentRows <= $totalRows) {
    // Fresh reader for every chunk
    $reader = new \PhpOffice\PhpSpreadsheet\Reader\Xlsx();
    $reader->setReadDataOnly(true);
    // Tell the read filter which 300-row window to load
    $chunkFilter = new ChunkReadFilterService();
    $reader->setReadFilter($chunkFilter);
    $chunkFilter->setRows($currentRows, 300);
    $spreadsheet = $reader->load($path);
    $sheetData = $spreadsheet->getActiveSheet()->toArray(null, true, true, true);
    $currentRows += 300;
    echo count($sheetData);
    echo '<br/>';
}
It gives this output:
300
600
900
1051
while it should echo 300 every time, because the chunk size is 300 and it should pick 300 records on each pass; only the last one should be 151.
Any help or suggestion would be appreciated. Thanks.
2 Answers
Try a ChunkReadFilter, as described in the PhpSpreadsheet documentation on combining read filters:
https://phpspreadsheet.readthedocs.io/en/latest/topics/reading-files/#combining-read-filters-with-the-setsheetindex-method-to-split-a-large-csv-file-across-multiple-worksheets
Example:
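A minimal filter along the lines of the documented example (a sketch adapted from the linked docs; readCell() admits only the rows inside the current window):

use PhpOffice\PhpSpreadsheet\Reader\IReadFilter;

class ChunkReadFilter implements IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    // Define the window of rows to load for the current chunk
    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    // Called for every cell; return true only for rows inside the window
    public function readCell($columnAddress, $row, $worksheetName = '')
    {
        return ($row >= $this->startRow && $row < $this->endRow);
    }
}

Note that even with a filter like this, toArray() still returns the sheet from row 1 up to the highest loaded row (rows skipped by the filter just come back as nulls), which is why your counts grow by 300 each pass. Reading only the chunk's range avoids that, for example (the 'Z' end column is a placeholder for your last data column):

// Read just the rows belonging to this chunk instead of the whole sheet
$sheetData = $spreadsheet->getActiveSheet()->rangeToArray(
    'A' . $currentRows . ':Z' . min($currentRows + 299, $totalRows),
    null, true, true, true
);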
You can try another library, FastExcelLaravel; it's easier to use. The code below reads the sheet row by row and can read huge files.
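Roughly like this (a sketch, not a verified implementation: Excel::open(), sheet(), and the nextRow() generator follow the FastExcelLaravel README, while the 'imports' table and the column mapping are placeholders; check the package docs for the exact names in your version):

use avadim\FastExcelLaravel\Excel;
use Illuminate\Support\Facades\DB;

$excel = Excel::open($path);
$sheet = $excel->sheet(); // active sheet

$batch = [];
foreach ($sheet->nextRow() as $rowNum => $rowData) { // generator: one row in memory at a time
    if ($rowNum === 1) {
        continue; // skip the header row
    }
    // Map spreadsheet columns to table columns (placeholder names)
    $batch[] = [
        'name'  => $rowData['A'] ?? null,
        'email' => $rowData['B'] ?? null,
    ];
    if (count($batch) === 300) {
        DB::table('imports')->insert($batch); // flush every 300 rows
        $batch = [];
    }
}
if ($batch) {
    DB::table('imports')->insert($batch); // final partial batch (e.g. the last 151 rows)
}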