My implementation is in Laravel, but this could generally apply to PHP/MySQL.
I have a large file, generally about 500 MB, that I want to store in the database. If it helps, the file itself is a zip of PDFs.
The code is straightforward:
$this->recordRequest->update([
'data' => file_get_contents($file),
]);
The error I get is that I’ve run out of memory. Allowed memory size of 134217728 bytes exhausted (tried to allocate 37456160 bytes).
What I’m looking for is something along these lines:
- A method to stream data from php to mysql
- A method to chunk the data for storage
Similarly, I'd need a way to present the data to the user, ideally by streaming it from MySQL. I haven't tried it, but maybe I could use repeated SUBSTRING() calls on the BLOB.
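Roughly what I have in mind for the read side, as a sketch only (the record_requests table and data column are placeholder names, inside a controller action):

```php
// Sketch of the SUBSTRING() idea: pull the BLOB down in slices so PHP
// never holds more than one slice at a time.
use Illuminate\Support\Facades\DB;

$sliceSize = 4 * 1024 * 1024; // 4 MB per round trip, well under max_allowed_packet

return response()->streamDownload(function () use ($id, $sliceSize) {
    $offset = 1; // MySQL SUBSTRING() is 1-indexed
    do {
        $piece = (string) DB::selectOne(
            'SELECT SUBSTRING(data, ?, ?) AS piece FROM record_requests WHERE id = ?',
            [$offset, $sliceSize, $id]
        )->piece;
        echo $piece;           // written straight to the response body
        $offset += $sliceSize;
    } while (strlen($piece) === $sliceSize);
}, 'records.zip');
```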
Non-starters
- I realize that I could store the file in the filesystem or on S3 or somewhere else, but for now, I need to store it in the DB.
- I can't increase the allowed memory size. My max_allowed_packet is about 15 MB; I can't change that either.
2 Answers
How about splitting the zip file into several small files (because of max_allowed_packet) and loading them into the database with "LOAD_FILE(file_name)"?
https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_load-file
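A rough sketch of that idea, assuming PHP and the MySQL server share a host, the connecting user has the FILE privilege, the chunk directory satisfies secure_file_priv, and a record_request_chunks table exists (all of these names and conditions are assumptions, not from the answer):

```php
// Split the zip into pieces smaller than max_allowed_packet and insert each
// piece with LOAD_FILE(). LOAD_FILE() reads the path on the MySQL server host,
// so the chunk files must be visible to the server and readable by it.
use Illuminate\Support\Facades\DB;

$chunkSize = 10 * 1024 * 1024;            // stay below the ~15 MB packet limit
$source    = fopen($file, 'rb');
$seq       = 0;

while (!feof($source)) {
    $piece = fread($source, $chunkSize);
    if ($piece === '' || $piece === false) {
        break;
    }

    $chunkPath = "/var/lib/mysql-files/chunk_{$seq}"; // must satisfy secure_file_priv
    file_put_contents($chunkPath, $piece);

    DB::insert(
        'INSERT INTO record_request_chunks (record_request_id, seq, data)
         VALUES (?, ?, LOAD_FILE(?))',
        [$recordRequestId, $seq, $chunkPath]
    );

    unlink($chunkPath);
    $seq++;
}
fclose($source);
```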
One way to do this is to store the file data in chunks. You will need two tables: one for the file and another for the file chunk data.
Models:
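A minimal sketch of the two models, assuming a File parent and a FileChunk child (class, table, and column names are assumptions; shown together for brevity):

```php
<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Model;

// Parent record: one row per stored file.
class File extends Model
{
    protected $fillable = ['name'];
}

// Child record: one row per chunk of the file's binary data.
class FileChunk extends Model
{
    protected $fillable = ['file_id', 'seq', 'data'];
}
```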
Now you can create a relationship between the models:
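Continuing the same assumed names: one File has many FileChunk rows, ordered by a seq column, and each chunk belongs to its file.

```php
// Continuing the sketch above: add the relationship methods to the models.
class File extends Model
{
    public function chunks()
    {
        return $this->hasMany(FileChunk::class)->orderBy('seq');
    }
}

class FileChunk extends Model
{
    public function file()
    {
        return $this->belongsTo(File::class);
    }
}
```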
Create a new file record and store the contents chunk by chunk:
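A sketch under the same assumptions, reading the source a few megabytes at a time so neither PHP's memory limit nor max_allowed_packet is hit:

```php
// Read the upload in pieces and create one FileChunk row per read.
// Chunk size and column names are assumptions.
$chunkSize = 10 * 1024 * 1024; // ~10 MB, below the ~15 MB packet limit

$record = File::create(['name' => basename($file)]);

$handle = fopen($file, 'rb');
$seq    = 0;

while (!feof($handle)) {
    $data = fread($handle, $chunkSize);
    if ($data === '' || $data === false) {
        break;
    }
    $record->chunks()->create([
        'seq'  => $seq++,
        'data' => $data,
    ]);
}
fclose($handle);
```

For chunks of that size the data column would need to be a LONGBLOB (or MEDIUMBLOB, which tops out at 16 MB).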
Now, if you want to read the file data back out and restore it to a file again:
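A sketch of the read side, again with the assumed names: iterate the chunks in order and write each one out as it arrives, either to a file on disk or straight to the response, so the full file never sits in memory at once.

```php
// chunks() already orders by seq; fetching one row per query keeps only
// one chunk in memory at a time.
$record = File::findOrFail($id);

// Restore to a file on disk:
$out = fopen(storage_path('app/restored.zip'), 'wb');
$record->chunks()->each(function ($chunk) use ($out) {
    fwrite($out, $chunk->data);
}, 1);
fclose($out);

// Or stream it to the user instead:
return response()->streamDownload(function () use ($record) {
    $record->chunks()->each(function ($chunk) {
        echo $chunk->data;
    }, 1);
}, 'records.zip', ['Content-Type' => 'application/zip']);
```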