I want to implement downloading large files (> 4 GB) from my ASP.NET Core backend. Many articles point out that HttpResponse.TransmitFile in the .NET Framework could achieve this, but it seems that HttpResponse.TransmitFile is no longer available in .NET Core.
Does anyone know what the alternative to HttpResponse.TransmitFile is in .NET Core? Any relevant answers would be much appreciated.
2 Answers
You can use the sample below to implement the requirement; for more details, check the blog post Streaming Zip on ASP.NET Core. For zip, this approach has all the same advantages as the previous non-Core solution described in that post:
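A minimal sketch of streaming a zip of several files straight to the response body, assuming a hypothetical source folder C:\files; the archive is written as it is built, so it is never buffered whole in memory or on disk:

```csharp
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public class DownloadController : ControllerBase
{
    // Hypothetical folder containing the files to zip up.
    private const string SourceFolder = @"C:\files";

    [HttpGet("download/zip")]
    public async Task GetZip()
    {
        Response.ContentType = "application/octet-stream";
        Response.Headers["Content-Disposition"] = "attachment; filename=\"files.zip\"";

        // Write the archive directly to the response body.
        using var archive = new ZipArchive(Response.Body, ZipArchiveMode.Create, leaveOpen: true);
        foreach (var path in Directory.EnumerateFiles(SourceFolder))
        {
            var entry = archive.CreateEntry(Path.GetFileName(path), CompressionLevel.Fastest);
            await using var entryStream = entry.Open();
            await using var fileStream = System.IO.File.OpenRead(path);
            await fileStream.CopyToAsync(entryStream);
        }
    }
}
```

Note that ZipArchive can issue synchronous writes when entries are closed, which may require allowing synchronous IO on the server.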
I suspect the real question isn't finding the alternative to TransmitFile (it's return File(path) or return File(stream)), but handling request ranges so clients can download large files in chunks that can be retried if interrupted. Luckily, this is already supported by both the ControllerBase.File method available since ASP.NET Core 2.1 and the Results.File method used in Minimal APIs (among others). Range processing is off by default but can be enabled by passing true to the enableRangeProcessing parameter, for example:
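A minimal controller action sketch with range processing enabled (the folder path and route are hypothetical):

```csharp
using System.IO;
using Microsoft.AspNetCore.Mvc;

[ApiController]
public class DownloadController : ControllerBase
{
    [HttpGet("download/{name}")]
    public IActionResult Download(string name)
    {
        // Hypothetical folder; validate the file name before using it in real code.
        var path = Path.Combine(@"C:\files", name);
        var stream = System.IO.File.OpenRead(path);

        // enableRangeProcessing: true honours Range/If-Range headers,
        // so interrupted downloads can be resumed in chunks.
        return File(stream, "application/octet-stream", name, enableRangeProcessing: true);
    }
}
```

Results.File in a Minimal API endpoint accepts the same enableRangeProcessing parameter.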
Even better, the Static Files provider also supports ranges (and response compression) out of the box. If the large files are in a specific folder, you could serve them with:
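A sketch using the static file middleware, assuming a hypothetical C:\LargeFiles folder exposed under /downloads:

```csharp
using Microsoft.Extensions.FileProviders;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Serve the folder at /downloads; the middleware handles Range requests itself.
app.UseStaticFiles(new StaticFileOptions
{
    FileProvider = new PhysicalFileProvider(@"C:\LargeFiles"),
    RequestPath = "/downloads",
    ServeUnknownFileTypes = true // also serve files without a known MIME type
});

app.Run();
```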
If you want to use response compression in your own actions, you'll have to either enable it on the web server or enable it explicitly through the response compression middleware:
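For the middleware option, a minimal sketch (enabling compression over HTTPS is optional and has security trade-offs):

```csharp
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();

// Register the response compression services.
builder.Services.AddResponseCompression(options =>
{
    // Off by default because compressing HTTPS responses can enable
    // BREACH/CRIME-style attacks; opt in only if that is acceptable.
    options.EnableForHttps = true;
});

var app = builder.Build();

// Add the middleware early so subsequent responses can be compressed.
app.UseResponseCompression();

app.MapControllers();
app.Run();
```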
From that point on, it’s up to the client to retrieve specific chunks and retry them. Download utilities typically download large files in chunks and retry failed parts automatically. Khalid Abuhakmeh describes the process and how it works with ASP.NET Core in a short blog post.
In C#, HttpClient can request specific chunks of the file and even download them concurrently using the Range header, e.g.:
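A sketch of requesting a single byte range (the URL, file name, and chunk size are hypothetical):

```csharp
using System.Net.Http;
using System.Net.Http.Headers;

using var client = new HttpClient();

var request = new HttpRequestMessage(HttpMethod.Get, "https://example.com/downloads/big.iso");
// Ask for the first 10 MB only (bytes 0..10485759).
request.Headers.Range = new RangeHeaderValue(0, 10_485_759);

using var response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
response.EnsureSuccessStatusCode(); // 206 Partial Content when the server supports ranges

await using var chunk = System.IO.File.Create("big.iso.part0");
await response.Content.CopyToAsync(chunk);
```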
If you have a list of ranges, you can use it to download the remote file in parallel and combine the chunks later:
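A sketch under assumed inputs (URL, total length, chunk size); each range is downloaded concurrently into memory and then written to its offset in the output file, so in practice you would also cap the concurrency and retry failed ranges:

```csharp
using System.Net.Http;
using System.Net.Http.Headers;

var url = "https://example.com/downloads/big.iso"; // hypothetical
var totalLength = 4L * 1024 * 1024 * 1024;          // e.g. taken from a HEAD request
var chunkSize = 64L * 1024 * 1024;

// Build the list of (from, to) byte ranges.
var ranges = new List<(long From, long To)>();
for (long start = 0; start < totalLength; start += chunkSize)
    ranges.Add((start, Math.Min(start + chunkSize - 1, totalLength - 1)));

using var client = new HttpClient();

// Download every range concurrently.
var tasks = ranges.Select(async range =>
{
    var request = new HttpRequestMessage(HttpMethod.Get, url);
    request.Headers.Range = new RangeHeaderValue(range.From, range.To);
    using var response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
    response.EnsureSuccessStatusCode();
    return (range.From, Bytes: await response.Content.ReadAsByteArrayAsync());
}).ToList();

var chunks = await Task.WhenAll(tasks);

// Reassemble the chunks at their original offsets.
await using var output = System.IO.File.Create("big.iso");
foreach (var (offset, bytes) in chunks.OrderBy(c => c.From))
{
    output.Position = offset;
    await output.WriteAsync(bytes);
}
```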