
We are setting up a process to read log files and generate reports on long-running queries in PostgreSQL. However, we are looking for a configuration setting that lets us track only stored procedure and function calls, excluding unnecessary DDL creation logs. Our current configuration settings are:

log_statement = 'all'
log_min_duration_statement = 1000

While these settings provide the desired log data, they generate a significant volume of logs, impacting performance and storage. Is there a specific configuration setting or approach that can help us reduce the log files and retain only the necessary data for stored procedures and functions?

We appreciate any insights or recommendations on optimizing PostgreSQL logging for our use case.

2 Answers


  1. PostgreSQL does not provide a built-in configuration option that logs only stored procedure or function calls, but you can get close by combining the log_statement and log_min_duration_statement settings.

    Here’s an approach to achieve your goal:

    Set log_statement to ‘none’ so that no statement category is logged unconditionally:

    log_statement = 'none'
    

    This stops the unconditional logging of DDL (Data Definition Language) statements like CREATE, ALTER, and DROP, as well as of every SELECT, INSERT, UPDATE, and DELETE.

    Keep log_min_duration_statement as the single trigger for logging:

    log_min_duration_statement = 1000
    

    This logs the text and duration of any statement that runs longer than the specified threshold (in milliseconds), including function and procedure calls such as SELECT myfunc() or CALL myproc(). Avoid log_duration = on: it logs a duration line for every single statement and would inflate, not reduce, your log volume. Adjust the log_min_duration_statement threshold as needed.

    With these settings, you’ll capture long-running stored procedures and functions while dropping the unconditional DDL logging. Keep in mind that a long-running DDL statement will still appear in the log when it exceeds the threshold, but this should significantly reduce the amount of log data generated.

    Remember to monitor and adjust the settings based on your specific performance and logging requirements. Additionally, consider using tools like pgBadger or pg_stat_statements to analyze and generate reports from your PostgreSQL logs efficiently.
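    As a sketch of the statistics-based route mentioned above: PostgreSQL keeps per-function call counters when track_functions is enabled, which lets you report on slow functions without parsing log files at all. The view and columns below are the standard pg_stat_user_functions catalog; the LIMIT and ordering are just one plausible report shape:

    ```sql
    -- In postgresql.conf (or per session with SET):
    -- track_functions = 'pl'   -- 'pl' covers PL/pgSQL and other PL languages;
    --                          -- 'all' also includes C-language functions

    -- Report the slowest user-defined functions by total execution time
    SELECT schemaname,
           funcname,
           calls,
           total_time,   -- ms spent in the function, including nested calls
           self_time     -- ms excluding time spent in nested functions
    FROM   pg_stat_user_functions
    ORDER  BY total_time DESC
    LIMIT  20;
    ```

    This complements the log-based approach: the statistics view gives aggregate numbers cheaply, while the log (via pgBadger) gives you individual slow calls with timestamps.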

    Lastly, ensure that you have proper backup and testing procedures in place before implementing changes to production configurations.

  2. My first impulse was to have a superuser set a parameter on your functions, so that only your functions are logged:

    ALTER FUNCTION logme() SET log_min_duration_statement = 0;
    

    But that will fail when the function is called by a non-superuser:

    SELECT logme();
    ERROR:  permission denied to set parameter "log_min_duration_statement"
    

    You could work around that by making all your functions SECURITY DEFINER and having them owned by a superuser, but that’s certainly not desirable for security reasons.
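    For completeness, that workaround would look roughly like this, using the logme() function from the example above. The role name postgres is an assumption standing in for whichever superuser role you have; again, this route is not recommended for security reasons:

    ```sql
    -- Run as a superuser: the function now executes with its owner's privileges
    ALTER FUNCTION logme() SECURITY DEFINER;
    ALTER FUNCTION logme() OWNER TO postgres;

    -- The per-function setting is applied with the owner's privileges,
    -- so it works even when a non-superuser calls the function
    ALTER FUNCTION logme() SET log_min_duration_statement = 0;
    ```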

    The best I can come up with is to add explicit instrumentation to each of your functions:

    CREATE FUNCTION somefunc(...) RETURNS ...
    LANGUAGE plpgsql AS
    $$DECLARE
       tstart timestamp with time zone := clock_timestamp();
       ...
    BEGIN
       ...
       /* either log the duration */
       RAISE LOG 'called function somefunc(): duration %', clock_timestamp() - tstart;
       /* or write it to a table */
       INSERT INTO logtab (funcname, start_time, duration)
       VALUES ('somefunc', tstart, clock_timestamp() - tstart);
    
       RETURN ...;
    END;$$;
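    If you go the table route, note that logtab is not defined in the example above; a minimal definition matching the INSERT, plus one possible report query, might look like this (the table and column names are assumptions taken from that INSERT):

    ```sql
    CREATE TABLE logtab (
        funcname   text NOT NULL,
        start_time timestamp with time zone NOT NULL,
        duration   interval NOT NULL   -- clock_timestamp() - tstart yields an interval
    );

    -- Example report: call counts and worst/average runtime per function
    SELECT funcname,
           count(*)      AS calls,
           max(duration) AS worst,
           avg(duration) AS average
    FROM   logtab
    GROUP  BY funcname
    ORDER  BY worst DESC;
    ```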
    