I’ve got a table that looks something like this:
create table room_server_metrics (
namespace varchar(36) not null,
session_id varchar(36) not null,
primary key (namespace, session_id),
created_at timestamptz not null default now(),
updated_at timestamptz not null default now()
);
When I try to sum the length of all sessions, I get this:
database=> select sum(updated_at - created_at) as total_time from room_server_metrics;
total_time
----------------------------
94 days 60951:01:56.381483
94 days + 60951 hours is ~2633 days.
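(Working it out: 60951 hours ÷ 24 ≈ 2539.6 days, and 94 + 2539.6 ≈ 2633.6 days.)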
I wasn’t sure if the output was just showing 94 days to make it easier to read, but if I print the delta in seconds instead, I get this:
database=> select extract(epoch from sum(updated_at - created_at)) as total_time from room_server_metrics;
total_time
----------------------------
227546617.181704
227546617.181704 seconds is also ~2633 days.
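(227546617 seconds ÷ 86400 ≈ 2633.6 days.)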
Why does the first query not reduce the number of hours past 94 days? I can’t for the life of me figure this one out haha.
Max
2 Answers
Thanks to Adrian's comment and some postgres docs research I found this:
It seems postgres takes each row's timespan and adds up the days/hours/minutes separately. In this case, there were 94 rows with at least 1 day's worth of time in the interval that was summed up. Using the justify_hours function does indeed solve the problem:
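For example, here's a quick sketch of both the behaviour and the fix; the first two statements use made-up literal intervals just to illustrate, and the last one applies justify_hours to the same sum as in the question:

-- interval arithmetic keeps the days and hours fields separate,
-- so a sum never rolls 24-hour chunks of hours up into days on its own:
select interval '20 hours' + interval '10 hours';   -- 30:00:00
-- justify_hours does that normalization explicitly:
select justify_hours(interval '30 hours');          -- 1 day 06:00:00
-- applied to the sum from the question:
select justify_hours(sum(updated_at - created_at)) as total_time
from room_server_metrics;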

extract(epoch from ...) is going to give garbage data. Epoch timestamps are only meaningful for actual timestamps, not for interval types.
Try justify_days:
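A sketch of what that could look like on the table from the question; justify_interval is shown as well, since it combines the justify_days and justify_hours adjustments:

-- justify_days folds each 30-day chunk of the days field into months:
select justify_days(sum(updated_at - created_at)) as total_time
from room_server_metrics;
-- justify_interval applies both the justify_days and justify_hours adjustments:
select justify_interval(sum(updated_at - created_at)) as total_time
from room_server_metrics;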
The PostgreSQL manual has this to say: