I am trying to import an Excel file into my database (Laravel/MySQL). I am using the PhpSpreadsheet library and reading the file in chunks, because Excel files can be very large.
The problem is that the chunks are not working properly: I want chunks of 300 rows, but each new chunk seems to include the previous one, which gives incorrect chunk boundaries and results in a lot of duplicates and garbage data in my database.
Here is my code:

    $totalRows = 1051;
    $currentRows = 1;

    while ($currentRows <= $totalRows) {
        $reader = new \PhpOffice\PhpSpreadsheet\Reader\Xlsx();
        $reader->setReadDataOnly(true);

        $chunkFilter = new ChunkReadFilterService();
        $reader->setReadFilter($chunkFilter);
        $chunkFilter->setRows($currentRows, 300);

        $spreadsheet = $reader->load($path);

        $sheetData = $spreadsheet->getActiveSheet()->toArray(null, true, true, true);
        $currentRows += 300;

        echo count($sheetData);
        echo '<br>';
    }

It gives the result

    300
    600
    900
    1051

while it should echo 300 every time, because the chunk size is 300 and it should pick 300 records each time; only the last chunk should be 151.
Any help or suggestion would be appreciated. Thanks.

2 Answers


  1. Try ChunkReadFilter:

    https://phpspreadsheet.readthedocs.io/en/latest/topics/reading-files/#combining-read-filters-with-the-setsheetindex-method-to-split-a-large-csv-file-across-multiple-worksheets

    example:

    $inputFileType = 'Csv';
    $inputFileName = './sampleData/example2.csv';

    echo 'Loading file ',pathinfo($inputFileName,PATHINFO_BASENAME),' using IOFactory with a defined reader type of ',$inputFileType,'<br />';
    /**  Create a new Reader of the type defined in $inputFileType  **/
    $reader = \PhpOffice\PhpSpreadsheet\IOFactory::createReader($inputFileType);

    /**  Define how many rows we want to read for each "chunk"  **/
    $chunkSize = 65530;
    /**  Create a new Instance of our Read Filter  **/
    $chunkFilter = new ChunkReadFilter();

    /**  Tell the Reader that we want to use the Read Filter  **/
    /**    and that we want to store it in contiguous rows/columns  **/
    $reader->setReadFilter($chunkFilter)
        ->setContiguous(true);

    /**  Instantiate a new Spreadsheet object manually  **/
    $spreadsheet = new \PhpOffice\PhpSpreadsheet\Spreadsheet();

    /**  Set a sheet index  **/
    $sheet = 0;
    /**  Loop to read our worksheet in "chunk size" blocks  **/
    /**  $startRow is set to 2 initially because we always read the headings in row #1  **/
    for ($startRow = 2; $startRow <= 1000000; $startRow += $chunkSize) {
        /**  Tell the Read Filter which rows we want to read this loop  **/
        $chunkFilter->setRows($startRow,$chunkSize);

        /**  Increment the worksheet index pointer for the Reader  **/
        $reader->setSheetIndex($sheet);
        /**  Load only the rows that match our filter into a new worksheet  **/
        $reader->loadIntoExisting($inputFileName,$spreadsheet);
        /**  Set the worksheet title for the sheet that we've just loaded  **/
        /**    and increment the sheet index as well  **/
        $spreadsheet->getActiveSheet()->setTitle('Country Data #'.(++$sheet));
    }
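
    Note that setContiguous() in the example above is specific to the Csv reader, so it is not available for your Xlsx file. The growing counts you see (300, 600, 900, 1051) are most likely because toArray() always returns rows starting from row 1 up to the highest row that was loaded, with the filtered-out rows left empty. One way around that, keeping your own ChunkReadFilterService, is to extract only the current chunk's rows with rangeToArray(). A rough sketch, assuming your data sits in columns A to Z (adjust the column range to your sheet):

        $chunkSize = 300;
        $totalRows = 1051;

        for ($startRow = 1; $startRow <= $totalRows; $startRow += $chunkSize) {
            $reader = new \PhpOffice\PhpSpreadsheet\Reader\Xlsx();
            $reader->setReadDataOnly(true);

            $chunkFilter = new ChunkReadFilterService();
            $chunkFilter->setRows($startRow, $chunkSize);
            $reader->setReadFilter($chunkFilter);

            $spreadsheet = $reader->load($path);

            // Read only this chunk's rows instead of everything from row 1.
            $endRow    = min($startRow + $chunkSize - 1, $totalRows);
            $sheetData = $spreadsheet->getActiveSheet()
                ->rangeToArray('A' . $startRow . ':Z' . $endRow, null, true, true, true);

            echo count($sheetData), '<br>';   // 300, 300, 300, 151

            // Free memory before loading the next chunk.
            $spreadsheet->disconnectWorksheets();
            unset($spreadsheet);
        }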
    
  2. You can try another library, FastExcelLaravel; it's easier to use.
    The code below reads the sheet row by row and can read huge files:

        $excel = Excel::open($file);
        $sheet = $excel->getSheet('SheetName');
        foreach ($sheet->nextRow() as $rowNum => $rowData) {
            // $rowData is array ['A' => ..., 'B' => ...]
            // handling of $rowData here
        }
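
    If you go this route, you can buffer the rows and write them to MySQL in batches instead of running one query per row. A rough sketch, assuming a hypothetical `imports` table and column mapping (adjust the names to your schema):

        use Illuminate\Support\Facades\DB;

        $excel = Excel::open($file);
        $sheet = $excel->getSheet('SheetName');

        $batch = [];
        foreach ($sheet->nextRow() as $rowNum => $rowData) {
            // The column-to-field mapping below is hypothetical.
            $batch[] = [
                'name'  => $rowData['A'],
                'email' => $rowData['B'],
            ];

            // Flush every 300 rows to keep memory and query size bounded.
            if (count($batch) >= 300) {
                DB::table('imports')->insert($batch);
                $batch = [];
            }
        }

        // Insert whatever is left over.
        if (!empty($batch)) {
            DB::table('imports')->insert($batch);
        }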
    