
My implementation is Laravel, but this could apply generally to PHP/MySQL.

I have a large file, generally about 500MB that I want to store in the database. If it helps, the file itself is a zip of pdfs.

The code is straightforward:

    $this->recordRequest->update([
      'data' => file_get_contents($file),
    ]);

The error I get is that I’ve run out of memory: Allowed memory size of 134217728 bytes exhausted (tried to allocate 37456160 bytes).

What I’m looking for is something along these lines:

  1. A method to stream data from php to mysql
  2. A method to chunk the data for storage

Similarly, I’d need a way to present the data to the user, ideally by streaming the data from mysql. I haven’t tried it, but maybe I could use repeated SUBSTRING() calls on the blob.
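For illustration, the SUBSTRING() idea might look something like this (an untested sketch using plain PDO; the table name, column names, and connection details are placeholders, not my actual schema):

```php
<?php
// Untested sketch: stream a BLOB to the client in slices using SUBSTRING(),
// so no more than $sliceSize bytes are held in PHP memory at once.
// Assumes a hypothetical `record_requests` table with `id` and `data` columns.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

$sliceSize = 1024 * 1024; // 1 MB per round trip, well under max_allowed_packet
$offset = 1;              // SUBSTRING() positions are 1-based in MySQL

$stmt = $pdo->prepare(
    'SELECT SUBSTRING(data, :offset, :len) FROM record_requests WHERE id = :id'
);

while (true) {
    $stmt->execute(['offset' => $offset, 'len' => $sliceSize, 'id' => $id]);
    $slice = $stmt->fetchColumn();
    if ($slice === false || $slice === null || $slice === '') {
        break; // past the end of the blob
    }
    echo $slice; // send this slice to the client
    flush();
    $offset += strlen($slice);
}
```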

Non-starters

  1. I realize that I could store the file in the filesystem or on S3 or somewhere else, but for now, I need to store it in the DB.
  2. I can’t increase the allowed memory size. My max_allowed_packet is about 15 MB, and I can’t change that either.


Answers


  1. How about splitting the zip file into several small files (because of max_allowed_packet) and loading them into the database with "LOAD_FILE(file_name)"?

    https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_load-file
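    A sketch of that idea, storing each piece as its own row so no single statement exceeds max_allowed_packet. Note the caveats: LOAD_FILE() reads the file on the MySQL *server's* filesystem, the directory must be permitted by secure_file_priv, the account needs the FILE privilege, and LOAD_FILE() silently returns NULL if any of these conditions fail. The `file_chunks` table and connection details below are placeholders:

    ```php
    <?php
    // Load pre-split pieces of a large file into one chunk row each.
    // e.g. produced beforehand with: split -b 10m archive.zip /var/lib/mysql-files/part_
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

    $parts = glob('/var/lib/mysql-files/part_*');
    sort($parts); // split names the pieces in order: part_aa, part_ab, ...

    $insert = $pdo->prepare(
        'INSERT INTO file_chunks (file_id, seq, chunk_data) VALUES (?, ?, LOAD_FILE(?))'
    );
    foreach ($parts as $seq => $path) {
        // The file is read server-side by MySQL; PHP never holds the bytes in memory.
        $insert->execute([$fileId, $seq, $path]);
    }
    ```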

  2. One way to do this is to store each file's data in chunks. You will need two tables: one for the file and another for the file's chunk data.
    Models:

    • File [id, name]
    • File Data [id, file_id, chunk_data]

    Now you can create a relationship between the models:

    class File extends Model {
    
        public function fileData() {
            return $this->hasMany(FileData::class);
        }
    }
    

    Create a new file record and store the contents chunk by chunk:

    $fileModel = File::create(['name' => basename($file)]);
    
    $handle = fopen($file, 'rb'); // Open the file in binary mode for reading
    if ($handle === false) {
        throw new RuntimeException("Unable to open {$file} for reading");
    }
    
    while (!feof($handle)) {
        // Read 1 MB at a time; keep this well below max_allowed_packet
        $chunk = fread($handle, 1024 * 1024);
        if ($chunk === false || $chunk === '') {
            break;
        }
        // Each chunk becomes its own FileData row via the relationship
        $fileModel->fileData()->create([
            'chunk_data' => $chunk,
        ]);
    }
    
    fclose($handle); // Close the file handle
    

    To read the data back and restore it to a file:

    // Assume $id is the ID of the file you want to restore
    $file = File::findOrFail($id);
    
    // Open a new file for writing
    $destinationPath = storage_path('app/'); // Destination directory for the new file
    $destinationFile = $destinationPath . $file->name;
    $handle = fopen($destinationFile, 'wb'); // Open the file in binary mode for writing
    
    if ($handle === false) {
        throw new RuntimeException("Unable to open {$destinationFile} for writing");
    }
    
    // Iterate the chunks with cursor() so only one row is hydrated at a time,
    // and append each chunk to the destination file in order
    foreach ($file->fileData()->cursor() as $chunk) {
        fwrite($handle, $chunk->chunk_data);
    }
    
    fclose($handle); // Close the file handle
    
    echo "File has been written to: " . $destinationFile;
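    If the goal is to send the file to the browser rather than write it to disk, the same loop can stream chunks straight into the response. A sketch using Laravel's streamDownload helper (the Content-Type is an assumption based on the zip in the question):

    ```php
    <?php
    // Sketch: stream the stored chunks to the browser without building the whole
    // file in memory. cursor() hydrates one FileData row at a time, so memory
    // use stays near the size of a single chunk.
    $file = File::findOrFail($id);

    return response()->streamDownload(function () use ($file) {
        foreach ($file->fileData()->cursor() as $chunk) {
            echo $chunk->chunk_data;
            flush();
        }
    }, $file->name, ['Content-Type' => 'application/zip']);
    ```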
    