I’m a PHP beginner and I’m trying to build a gallery page that will output thumbnails for all 195 images in a folder. These images average 8 MB each. Here’s the current situation:
php.ini in the script folder
allow_url_fopen = On
display_errors = On
enable_dl = Off
file_uploads = On
max_execution_time = 999
max_input_time = 999
max_input_vars = 1000
memory_limit = 999M
post_max_size = 516M
session.gc_maxlifetime = 1440
session.save_path = "/var/cpanel/php/sessions/ea-php74"
upload_max_filesize = 512M
zlib.output_compression = Off
PHP / HTML code
<?php
define('UPLOAD_DIR', 'sources/');
define('THUMB_DIR', 'thumbs/');

// Scale an image down to a 230px-wide thumbnail, keeping its aspect ratio
function GenerateThumbnail($src, $dest)
{
    $Imagick = new Imagick($src);
    $bigWidth = $Imagick->getImageWidth();
    $bigHeight = $Imagick->getImageHeight();
    $scalingFactor = 230 / $bigWidth;
    $newheight = $bigHeight * $scalingFactor;
    $Imagick->thumbnailImage(230, $newheight, true, true);
    $Imagick->writeImage($dest);
    $Imagick->clear();
    return true;
}

// Get list of files in upload dir
$arrImageFiles = scandir(UPLOAD_DIR);

// Remove non-images
$key = array_search('.', $arrImageFiles);
if ($key !== false)
    unset($arrImageFiles[$key]);
$key = array_search('..', $arrImageFiles);
if ($key !== false)
    unset($arrImageFiles[$key]);
$key = array_search('.ftpquota', $arrImageFiles);
if ($key !== false)
    unset($arrImageFiles[$key]);
$key = array_search('thumbs', $arrImageFiles);
if ($key !== false)
    unset($arrImageFiles[$key]);
?><!DOCTYPE HTML>
<html lang="fr">
<head>
<meta http-equiv="content-type" content="text/html; charset=utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>Select Image</title>
</head>
<body>
<?php
foreach ($arrImageFiles as $imageFile)
{
    $thumbFullPath = THUMB_DIR . "th_" . $imageFile;
    $imageFullPath = UPLOAD_DIR . $imageFile;
    if (!file_exists($thumbFullPath))
    {
        GenerateThumbnail($imageFullPath, $thumbFullPath);
    }
    echo "<img alt='' src='" . $thumbFullPath . "'>";
}
?>
</body>
</html>
The two issues I don’t know how to fix are:
Here’s what I’ve already tried:
Thanks for any ideas, I’m a bit lost.
2 Answers
The issue is almost certainly a timeout or resource exhaustion (memory or something along those lines). This article shows ImageMagick using 315.03 MB of memory and 47% CPU for a single image, and that can compound or leak when you cycle through images in a foreach loop. Image transformations are expensive in server resources. It might also be a problem with one of your image files or some other oddity.
Ajax it
A solution to any of these problems is to process the images in small batches. Create a simple ajax/PHP loop: an HTML/PHP page (thumb-display.php) that fires the ajax requests, and a separate file (thumb-process.php) that does the actual processing, something like this:
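What follows is only a rough sketch (not the answerer's actual code) of how those two files might be wired together: it assumes thumb-process.php receives an index via POST, generates that one thumbnail, and replies with the next index to request, and it assumes the question's GenerateThumbnail() has been moved into a shared thumb-functions.php include.
thumb-display.php
<!DOCTYPE html>
<html lang="fr">
<head>
<meta charset="utf-8">
<title>Generate thumbnails</title>
</head>
<body>
<script>
// Ask thumb-process.php for one thumbnail at a time, starting at index 0
// (or at whatever index the console reported when a previous run choked).
function processNext(index) {
    fetch('thumb-process.php', {
        method: 'POST',
        headers: {'Content-Type': 'application/x-www-form-urlencoded'},
        body: 'index=' + index
    })
    .then(response => response.json())
    .then(data => {
        console.log(data);               // logs nextIndex as each image finishes
        if (!data.done) {
            processNext(data.nextIndex); // keep going until every image is done
        }
    });
}
processNext(0);
</script>
</body>
</html>
thumb-process.php
<?php
// Hypothetical sketch: process the single image at the POSTed index,
// then tell the caller which index to request next.
define('UPLOAD_DIR', 'sources/');
define('THUMB_DIR', 'thumbs/');
require 'thumb-functions.php'; // assumed include holding GenerateThumbnail() from the question

header('Content-Type: application/json');

$index = isset($_POST['index']) ? (int) $_POST['index'] : 0;

// Rebuild the same filtered file list the gallery page uses
$files = array_values(array_diff(scandir(UPLOAD_DIR), ['.', '..', '.ftpquota', 'thumbs']));

if ($index >= count($files)) {
    echo json_encode(['done' => true]);
    exit;
}

$file  = $files[$index];
$thumb = THUMB_DIR . 'th_' . $file;

if (!file_exists($thumb)) {
    GenerateThumbnail(UPLOAD_DIR . $file, $thumb);
}

echo json_encode(['done' => false, 'nextIndex' => $index + 1, 'file' => $file]);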
Then just sit back and watch your console. If you get a timeout or another error, you'll see the nextIndex it choked on, and you can reload your page with that number as the starting index instead of 0. Of course you could gather all the file paths into an array in thumb-display.php and send each file path through ajax (so you don't have to re-scan that directory on every request), but personally I feel better about sending a number through POST than an image path. Let me know if you'd rather send a big array of file paths instead of an index number.
@kinglish's suggestion to load them via ajax works, but I don't think it's the right approach. It takes some load off the server by not processing them all at once, but it's still not a good solution if you ever have many users hitting those 195 images.
The biggest problem here is that you are trying to process images on request, so every single page load will request and re-create all 195 images. This is bad for performance, and chances are 10 users on your site could crash your server just by refreshing the page a few times, unless you're paying more than $10/mo for it.
So, here’s a better solution.
Run your image-processing script server side only, looping over the 195 images.
You can look up how to run a command that executes a PHP file, or just do "php yourfile.php" if you're SSH'd into a Linux machine; a sketch of such a one-off script follows these steps.
Run from the command line, the script isn't tied to a browser request and the CLI has no execution time limit by default, so it can grind through all the images in one pass.
Store the images in a "thumbnail" directory and then just load them like you normally would via <img> tags.
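As a rough sketch of that one-off pass (build-thumbs.php is a hypothetical name, and it reuses the question's GenerateThumbnail() through the same assumed thumb-functions.php include):
build-thumbs.php
<?php
// Hypothetical one-off script. Run it from the shell with:
//     php build-thumbs.php
// It should not need to be reachable from the web at all.
define('UPLOAD_DIR', 'sources/');
define('THUMB_DIR', 'thumbs/');
require 'thumb-functions.php'; // assumed include holding GenerateThumbnail() from the question

$files = array_values(array_diff(scandir(UPLOAD_DIR), ['.', '..', '.ftpquota', 'thumbs']));

foreach ($files as $i => $file) {
    $thumb = THUMB_DIR . 'th_' . $file;
    if (file_exists($thumb)) {
        continue; // already generated on a previous run
    }
    GenerateThumbnail(UPLOAD_DIR . $file, $thumb);
    echo ($i + 1) . '/' . count($files) . "  $file\n"; // simple progress output
}
echo "All thumbnails generated.\n";
After that, the gallery page only has to echo the <img> tags; it never touches Imagick.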
TADA, problem solved. You only process them once this way; the rest of the time you just serve them to the client.
Yes, you now have both the full-size and thumbnail versions, which feels wrong, but it performs way, way better than trying to regenerate them on every request.
Add some caching [load them to S3, or use Cloudflare to cache them] and your site will be as fast as any.