I am working with an Azure Database for PostgreSQL Flexible Server that contains a table where new data is inserted daily. Over time, this has led to the accumulation of millions of rows, and the process of writing data to this table has significantly slowed down.
To address this issue, I’m considering implementing a strategy where:
- Historical data archiving: I create a separate history table that stores all older data.
- Data retention: the primary table holds only data from the last 30 days.
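The move-then-delete step I have in mind could be sketched roughly like this (table and column names here are hypothetical placeholders for my actual schema):

```sql
-- Hypothetical schema: events(id, payload, created_at)
-- Move rows older than 30 days into the history table in a single
-- statement, so the delete and the insert happen atomically.
BEGIN;

WITH moved AS (
    DELETE FROM events
    WHERE created_at < now() - interval '30 days'
    RETURNING *
)
INSERT INTO events_history
SELECT * FROM moved;

COMMIT;
```

The data-modifying CTE avoids a window between copying and deleting in which rows could be missed or duplicated.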
My questions are:
- Is this a good approach for improving write performance?
- What other strategies could I consider to optimize database performance in this scenario?
- Are there specific considerations or best practices I should be aware of when implementing a history table in PostgreSQL, especially in an Azure environment?
Any insights, alternative solutions, or recommendations would be greatly appreciated. Thank you!
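One alternative I have been reading about is native range partitioning, where retiring old rows becomes a fast DETACH of a partition rather than a bulk DELETE. A rough sketch, again with hypothetical table and column names:

```sql
-- Hypothetical layout: partition the table by month of created_at.
CREATE TABLE events (
    id         bigint GENERATED ALWAYS AS IDENTITY,
    payload    jsonb,
    created_at timestamptz NOT NULL
) PARTITION BY RANGE (created_at);

CREATE TABLE events_2024_06
    PARTITION OF events
    FOR VALUES FROM ('2024-06-01') TO ('2024-07-01');

-- Retiring a month is a metadata-only operation, not a row-by-row DELETE:
ALTER TABLE events DETACH PARTITION events_2024_06;
```

I would be interested to hear whether partitioning is preferable to a manual history table on Flexible Server.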
I have also considered processing the data in batches.
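The batched variant I am considering would cap how many rows each transaction moves, to limit lock time and WAL pressure. A sketch under the same hypothetical schema (the batch size of 10,000 is an arbitrary placeholder):

```sql
-- Archive at most 10,000 of the oldest qualifying rows per run;
-- the job would repeat until no rows remain to move.
WITH batch AS (
    SELECT id
    FROM events
    WHERE created_at < now() - interval '30 days'
    ORDER BY created_at
    LIMIT 10000
),
moved AS (
    DELETE FROM events e
    USING batch b
    WHERE e.id = b.id
    RETURNING e.*
)
INSERT INTO events_history
SELECT * FROM moved;
```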