
I have several bash scripts triggered by crontab at the same times throughout the day. I was looking for a way to log STDOUT as well as STDERR to the same file, so that I have a single source of truth showing which of the scripts executed successfully and what the error was in the ones that didn't.

OS: Ubuntu 22.04.2 LTS

Currently, each script file looks something like this.

    #!/bin/bash

    exec > >(tee -ia ~/path/to/custom/log/file) 2>&1
    set -e # Stop execution if any exception
    echo `date` " " `pwd`

    ... rest of the code ...

I am using tee so that in case we run the script manually, the output/error would be displayed on the screen and we would not have to sort through the log file to find the desired output.

The Problem:
Though the logging is happening, the programs are triggered at the same time and each one prints its statements at different points in time, so the log file is currently a jumble of all the outputs produced by the programs. I was looking for a way to write each program's output as one contiguous block once the program has finished executing, so that the log file is more readable.

I cannot use /var/log/syslog (using the handy-dandy logger command of UNIX) due to ownership issues.

I would be happy to provide additional information if required.

P.S. I understand that exec is changing the file-descriptors. I have also tried subshelling the whole code and using just tee, but to no avail.
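
For illustration, a subshell attempt of that kind looks roughly like this; it suffers from the same interleaving, because tee still writes each line to the shared file the moment it is produced:

    #!/bin/bash

    (
        set -e # Stop execution if any exception
        echo "$(date) $PWD"

        ... rest of the code ...
    ) 2>&1 | tee -ia ~/path/to/custom/log/file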

2 Answers


  1. Chosen as BEST ANSWER

    This is how I managed it. I am aware it is a hacky solution; it does the job, but because it is hacky I am not marking my answer as accepted. I invite better solutions to the issue.

    The rationale: use a script-specific log file as a makeshift buffer for STDOUT and STDERR. When the program terminates, append its contents to the generic log file shared by all scripts and clear the script-specific log file.

        #!/bin/bash
    
        log_location="/path/to/script/specific/log/file/"
        station="name-of-code"
    
        # Define a function that appends the script-specific log to the general
        # log, then truncates the script-specific log
        handle_logs() {
            cat "$log_location/runlog_$station.log" >> ~/path/to/custom/log/file
            : > "$log_location/runlog_$station.log"
        }
    
        # Set up a trap to call the handle_logs function when the script exits
        trap 'handle_logs' EXIT
    
        exec &> >(tee -ia "$log_location/runlog_$station.log")
        set -e # Stop execution if any exception
        echo "$(date) $PWD"
        
        ... rest of the code ...
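
    One more caveat: if several scripts happen to exit at almost the same moment, the final appends in handle_logs can themselves interleave or race. A variant of handle_logs that serializes the append with flock(1) (part of util-linux, so present on Ubuntu 22.04) would avoid that; the file.lock path below is only an illustrative name:

        handle_logs() {
            {
                # Take an exclusive lock so only one exiting script appends
                # to the shared log at a time
                flock -x 9
                cat "$log_location/runlog_$station.log" >> ~/path/to/custom/log/file
                : > "$log_location/runlog_$station.log"
            } 9> ~/path/to/custom/log/file.lock
        }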
    

  2. What I would do:

    #!/bin/bash
    
    exec &> >(tee -ia ~/path/to/custom/log/file.$$)
    set -e # Stop execution if any exception
    echo "$(date) $PWD"
    

    Then when you want to retrieve these logs:

    cat ~/path/to/custom/log/file.* | less
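
    Since this creates one file per run (suffixed with the PID of the script), the log directory grows over time, so a periodic cleanup from cron may be useful as well. The 7-day retention below is only an example:

    # Remove per-run logs older than 7 days (adjust path and retention to taste)
    find ~/path/to/custom/log -name 'file.*' -mtime +7 -delete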
    