I'm running cPanel on a server with various customer accounts under the /home directory.

Many customers' error_log files are exceeding a desired size (let's say 100MB), and I want to create a cron job that runs daily and truncates any file over that size.

I know truncate can shrink files, but it will also extend files that are smaller than the stipulated size. Does my solution below (first finding all files above the desired size and only shrinking those) make the most sense, and will it work?
    for i in $(find /home -type f -iname error_log -size +99M); do
        truncate -s 100M $i
    done
2 Answers
Don't use `for i in $(...)`; it will break on whitespace. At the very least, quote `"$i"`. Better yet, `find` has `-exec`, just use it. So:
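Something along these lines would be my guess at the intended command (a sketch assuming GNU findutils and coreutils, keeping your original size filter and threshold):

    # run truncate on every error_log that find reports as larger than 99M,
    # setting each one to exactly 100M; no shell loop, so no word-splitting issues
    find /home -type f -iname error_log -size +99M -exec truncate -s 100M {} +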
I’d suggest rotating and compressing logs rather than truncating them. Logs typically compress really well, and you can move the compressed logs to backup media if you like. Plus, if you do have to delete anything, delete the oldest logs, not the newest ones.
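For example, a logrotate drop-in is one common way to do that. A minimal sketch, assuming logrotate is installed and that the logs live at a path like the one below (both are assumptions; adjust to your actual layout and retention policy):

    # hypothetical file: /etc/logrotate.d/customer-error-logs
    # rotate daily, or sooner if a log exceeds 100M; keep 7 compressed rotations
    /home/*/public_html/error_log {
        daily
        maxsize 100M
        rotate 7
        compress
        delaycompress
        missingok
        notifempty
    }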
That said, for educational purposes let's explore `truncate`. It has the ability to only shrink files, though it's buried in the documentation:
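In GNU coreutils' truncate(1), the SIZE argument may be prefixed with a modifying character, and `<` means "at most", so it never extends a file. A minimal single-file sketch (the path is illustrative; the quotes keep the shell from treating `<` as a redirection):

    # shrink the file to 100M only if it is currently larger; smaller files are left alone
    truncate -s '<100M' /path/to/error_log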
If the files are at a fixed depth you don't need the loop nor the `find` call. A simple glob will do:
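For instance, if each account kept its error_log directly in public_html (an assumed layout; substitute whatever fixed path applies on your server):

    truncate -s '<100M' /home/*/public_html/error_log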
If they're at unpredictable depths you can use extended globbing…
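One way is bash's globstar option (assuming bash 4 or later; `**` matches any number of directory levels):

    shopt -s globstar
    truncate -s '<100M' /home/**/error_log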
…or use `find -exec <cmd> {} +`, which tells `find` to invoke a command on the files it finds.
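A sketch combining it with the "at most" prefix (GNU find and truncate assumed), so no separate -size test is needed:

    find /home -type f -name error_log -exec truncate -s '<100M' {} +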
(If there are lots and lots of files, `find` is safest. The glob options could exceed Linux's command-line length limit, whereas `find` guards against that possibility.)