I have several bash scripts triggered by crontab at the same times throughout the day. I was looking for a way to log both STDOUT and STDERR to the same file, so that I have a single source of truth showing which scripts executed successfully and what the error was in those that didn't.
OS: Ubuntu 22.04.2 LTS
Currently, each script looks something like this:
#!/bin/bash
# Duplicate stdout and stderr to the log file while still printing them to
# the terminal; -i ignores interrupts, -a appends instead of overwriting.
exec > >(tee -ia ~/path/to/custom/log/file) 2>&1
set -e # Stop execution if any command fails
echo "$(date)  $(pwd)"
... rest of the code ...
I am using tee so that, if we run a script manually, the output and errors are also displayed on the screen and we don't have to dig through the log file to find the relevant output.
The Problem:
Though the logging works, the programs are triggered at the same time and each one prints its statements at different moments, so the log file is currently a jumble of the outputs of all the programs. I was looking for a way to write each program's output as one contiguous block, only after the program has finished, so that the log file is more readable.
I cannot use /var/log/syslog (using the handy-dandy logger command of UNIX) due to ownership issues.
I would be happy to provide additional information if required.
P.S. I understand that exec is changing the file descriptors. I have also tried running the whole body in a subshell and using just tee, but to no avail.
2 Answers
This is how I managed it. I am aware it is a hacky solution, but it does the job; because it is hacky, I am not marking my own answer as accepted, and I invite better solutions to the issue.
The rationale is: use a script-specific file as a makeshift buffer for STDOUT and STDERR. When the program terminates, append the buffer's contents to the shared log file for all scripts and clear the script-specific file.
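A minimal sketch of that approach (the paths under ~/logs/ are placeholders, and the flock is an addition to keep concurrent appends from interleaving):

#!/bin/bash
set -e # Stop execution if any command fails

# Placeholder paths -- adjust to your layout.
BUFFER=~/logs/$(basename "$0").buffer   # per-script buffer file
SHARED=~/logs/all-scripts.log           # shared log for every script
LOCK=~/logs/all-scripts.lock
mkdir -p ~/logs

# On exit (success or failure), append the whole buffer to the shared log
# in one block, serialized with flock so concurrent scripts cannot
# interleave their appends, then empty the buffer for the next run.
flush_log() {
    flock "$LOCK" -c "cat '$BUFFER' >> '$SHARED'"
    : > "$BUFFER"
}
trap flush_log EXIT

# Buffer everything this run prints; tee keeps it visible on manual runs.
exec > >(tee -ia "$BUFFER") 2>&1

echo "$(date)  $(pwd)"
... rest of the code ...

One caveat: the tee started by the process substitution writes asynchronously, so a line printed in the very last instant before exit can occasionally miss the flush. That is part of what makes this a hacky solution.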
What I would do:
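A sketch of one possible approach (hypothetical names and format; the exact commands may differ): tag every line with a timestamp and the script's name before it reaches the shared log, so each script's output can later be pulled out of the interleaved file.

#!/bin/bash
set -e

LOGFILE=~/path/to/custom/log/file   # same shared log as before
TAG=$(basename "$0")                # e.g. "backup.sh" -- hypothetical name

# Prefix every line with a timestamp and the script's name, then append it
# to the shared log while still echoing it for manual runs.
exec > >(
    while IFS= read -r line; do
        printf '%s [%s] %s\n' "$(date '+%F %T')" "$TAG" "$line"
    done | tee -ia "$LOGFILE"
) 2>&1

echo "Starting in $(pwd)"
... rest of the code ...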
Then when you want to retrieve these logs:
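Assuming the tagged format sketched above, retrieval is a grep on the script's tag:

# All lines ever logged by backup.sh (hypothetical tag), in order:
grep -F '[backup.sh]' ~/path/to/custom/log/file

# Or just the tail end of the most recent runs:
grep -F '[backup.sh]' ~/path/to/custom/log/file | tail -n 50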