I needed to serve very large files (1 GB+) from my web server using PHP with authentication, so that only certain customers had the right to download them. I didn't want to use direct links, and using a PHP script to serve the files wasn't an option due to memory and timeout limits.
My webserver is managed with Plesk and runs on Apache / CentOS 6.

2 Answers
The chosen solution was to install mod_xsendfile on the CentOS server, configure it, and send the files with PHP headers after user authentication.
Installing mod_xsendfile on CentOS. You need to enable the EPEL repository to get xsendfile; after that, a simple:
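The original install command wasn't preserved; with EPEL already enabled on CentOS 6, it would presumably be:

```shell
# Install mod_xsendfile from the EPEL repository (run as root)
yum install mod_xsendfile
```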
After that, you can check whether the module is correctly installed using:
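The check command itself is missing; a typical way to verify it (my assumption, not the original snippet) is to list Apache's loaded modules:

```shell
# List loaded Apache modules and filter for xsendfile;
# if the module is loaded you should see a line like "xsendfile_module (shared)"
httpd -M | grep xsendfile
```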
The configuration/Plesk part. Plesk generates every httpd.conf file automatically, so modifying the files by hand is dangerous because they can be overwritten by any change made via the Plesk UI. By default, the httpd.conf file includes a vhost.conf and a vhost_ssl.conf; for virtual hosts they are stored in
You can either edit them via SSH or through the Plesk UI (vhost -> server settings -> additional Apache directives). For non-Plesk users, just add the following directives under the correct vhost section of your httpd.conf:
The first time, I had some problems because the server was sending 0-byte files; after some testing, I found that the correct directives are:
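The exact directives are not preserved here; based on the note below about the "/" Directory, a working configuration would be shaped something like this (the XSendFilePath value is a placeholder, not the author's real path):

```apacheconf
<Directory "/">
    XSendFile On
    # XSendFilePath whitelists the directories mod_xsendfile is allowed
    # to serve files from; replace with your real storage directory
    XSendFilePath /var/www/vhosts/example.com/private
</Directory>
```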
Note: if I use a Directory directive other than "/", I get empty files and no errors at all in any log. I didn't test what happens when passing a relative path in the headers.
Since I'm using Laravel as my PHP framework, the file can then be sent to the user with:
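The original snippet is missing; for Laravel of that era, a sketch of sending the file through mod_xsendfile (variable names are my assumptions) could look like:

```php
// Hypothetical controller action; $path is the absolute path to the file,
// resolved only after the user has been authenticated.
// The response body is intentionally empty: Apache reads the X-Sendfile
// header and streams the file itself, so PHP never loads it into memory.
return Response::make('', 200, [
    'X-Sendfile'   => $path,
    'Content-Type' => 'application/octet-stream',
]);
```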
Hope this helps someone avoid hours of testing.
Dario's comment above is really helpful. I just wanted to add what I used for the Laravel return statement to make sure the files are named correctly:
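The return statement that followed is not preserved; a version that also sets the download filename via a Content-Disposition header (again a sketch, with assumed variable names) might look like:

```php
// $path is the absolute path on disk; $name is the filename the user
// should see in their download dialog.
return Response::make('', 200, [
    'X-Sendfile'          => $path,
    'Content-Type'        => 'application/octet-stream',
    'Content-Disposition' => 'attachment; filename="' . $name . '"',
]);
```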