I’m in the process of migrating a website that offers large downloadable files (4 GB and up) to Azure. I have successfully created the connection to the Azure storage account, so I can retrieve the files. However, the download (a) takes ages to start and (b) consumes a lot of memory, because the entire file is read into memory before the downloadable data is actually sent to the client.
Is there a way to read the file in chunks from the Storage account to avoid the delay and memory consumption?
The current code looks like this:
// $blobClient is the already-established connection to the storage account
$getBlobResult = $blobClient->getBlob($containerName, $FileName);
$blobProperties = $getBlobResult->getProperties();
$FileSize = $blobProperties->getContentLength();

// Send headers
header("Content-Type: application/octet-stream");
header("Content-Length: $FileSize");
header("Content-Disposition: attachment; filename=\"$FileName\"");

// Send content of file (this reads the entire blob into memory first)
echo stream_get_contents($getBlobResult->getContentStream());
3 Answers
For many reasons, I would recommend just serving the download directly from storage. Mostly because piping it through PHP has a big I/O performance impact and puts all of the download traffic on your web server.

Instead of using PHP, I would suggest the following, depending on what your hosting environment allows you to do: copy the file out of blob storage with azcopy, rename it to some<hash>.<ext>, and serve the direct URL to the user.
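For illustration, a rough sketch of such a copy using azcopy v10; the account, container, blob, SAS token, and destination path are all placeholders:

azcopy copy "https://<account>.blob.core.windows.net/<container>/<blob>?<sas-token>" "/var/www/downloads/some<hash>.<ext>"

Once copied and renamed, the file can be served by the web server as a plain static download, so PHP never touches the data.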
You can certainly do that. When downloading a blob using the getBlob method, you will need to specify the byte range you wish to download in the options parameter, which is of type GetBlobOptions. If I am not mistaken, you would need to define the range in start-end format (e.g. 0-1023 to download 1 KB).
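For illustration, a minimal sketch of streaming the blob to the client in ranged chunks. It assumes the microsoft/azure-storage-blob PHP SDK, where GetBlobOptions::setRange() takes a Range($start, $end) object (older SDK versions expose setRangeStart()/setRangeEnd() instead); the 4 MB chunk size is an arbitrary choice:

use MicrosoftAzure\Storage\Blob\Models\GetBlobOptions;
use MicrosoftAzure\Storage\Common\Models\Range;

$chunkSize = 4 * 1024 * 1024; // request 4 MB per round trip (arbitrary)

// Fetch only the blob's properties first to learn the total size
$FileSize = $blobClient->getBlobProperties($containerName, $FileName)
                       ->getProperties()
                       ->getContentLength();

header("Content-Type: application/octet-stream");
header("Content-Length: $FileSize");
header("Content-Disposition: attachment; filename=\"$FileName\"");

// Relay one byte range at a time instead of buffering the whole blob
for ($start = 0; $start < $FileSize; $start += $chunkSize) {
    $end = min($start + $chunkSize - 1, $FileSize - 1);

    $options = new GetBlobOptions();
    $options->setRange(new Range($start, $end));

    $chunk = $blobClient->getBlob($containerName, $FileName, $options);
    fpassthru($chunk->getContentStream());
    flush(); // push the chunk to the client before fetching the next one
}

Because each iteration issues its own ranged GET, the first bytes reach the client as soon as the first chunk arrives, and memory use stays at roughly one chunk.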
Serving the file to the user via a signed URL, as @CodeBrauer suggests, would be the most preferable solution, since you don’t have to pipe the data through your code/infrastructure at all.
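For illustration, a minimal sketch of generating such a signed URL (a shared access signature) and redirecting the client to it. It assumes the microsoft/azure-storage-blob SDK's BlobSharedAccessSignatureHelper and that $accountName, $accountKey, $containerName and $FileName are already defined; verify the exact helper method against your SDK version:

use MicrosoftAzure\Storage\Blob\BlobSharedAccessSignatureHelper;

$helper = new BlobSharedAccessSignatureHelper($accountName, $accountKey);

// Read-only token for this one blob, valid for one hour (ISO-8601 UTC expiry)
$sasToken = $helper->generateBlobServiceSharedAccessSignatureToken(
    'b',                          // signed resource type: a single blob
    "$containerName/$FileName",   // container/blob path
    'r',                          // permissions: read only
    gmdate('Y-m-d\TH:i:s\Z', time() + 3600)
);

// Redirect the browser straight to blob storage; no data flows through PHP
$url = "https://$accountName.blob.core.windows.net/$containerName/"
     . rawurlencode($FileName) . "?" . $sasToken;
header("Location: $url");
exit;

Since the client then downloads directly from blob storage, the slow start and the memory pressure disappear from the web server entirely.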
Though if that is at odds with your requirements, then you can try replacing:

echo stream_get_contents($getBlobResult->getContentStream());

With:

fpassthru($getBlobResult->getContentStream());

Which should just dump the stream straight out without needing virtually any buffer.
https://www.php.net/manual/en/function.fpassthru