
I want to store audit data of some events happening within my API service.

I am planning to have 3 columns for the audit table.

No more than 100 rows will be added to this audit table per day.

This audit table will rarely be accessed and will see very few write operations per day.
I want to minimize the cost, and I think that DynamoDB would be overkill here.

Is there another choice of storage, with RDS or some other AWS storage service, that I can use to achieve my goal?
I will be writing the data to the audit table from a Lambda function.

2 Answers


  1. What makes you think DynamoDB would be overkill? It scales down as well as it scales up. In fact, a low activity rate is a great time to pick a serverless model, because you pay only for what you use.

    Your usage would even fit within the free tier.
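
    For what it's worth, a minimal sketch of the Lambda side with an on-demand DynamoDB table could look like the snippet below. The table name and attribute names are placeholders, not anything from your setup.

    ```python
    import json
    import time
    import uuid

    import boto3

    dynamodb = boto3.resource("dynamodb")
    # Hypothetical table; create it with PAY_PER_REQUEST (on-demand) billing
    # so a handful of writes per day costs effectively nothing.
    table = dynamodb.Table("audit-events")

    def handler(event, context):
        # Three attributes, mirroring the three columns described in the question.
        table.put_item(
            Item={
                "event_id": str(uuid.uuid4()),    # partition key
                "occurred_at": int(time.time()),  # epoch seconds
                "payload": json.dumps(event),     # raw event details
            }
        )
        return {"statusCode": 200}
    ```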

  2. If you have a workload that's about 99% writes and 1% reads and doesn't have extreme read-performance requirements, as tends to be the case with audit workloads, I suggest rethinking your approach to storing this data.

    Consider having the Lambda function(s) write the audit events to a Kinesis Data Firehose delivery stream. Firehose can aggregate these records into JSON or Parquet files and store them in S3. When you need to access that data, you can use Athena to query it directly in S3.
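
    As a rough sketch (the delivery stream name and field names below are assumptions), the Lambda side is a single put_record call:

    ```python
    import json
    import time

    import boto3

    firehose = boto3.client("firehose")

    # Hypothetical Firehose delivery stream configured to buffer incoming
    # records and deliver them to an S3 bucket (as JSON, or converted to Parquet).
    STREAM_NAME = "audit-events-stream"

    def handler(event, context):
        record = {
            "occurred_at": int(time.time()),
            "event_type": event.get("type", "unknown"),
            "detail": event,
        }
        # Newline-delimited JSON so each record lands as one row for Athena.
        firehose.put_record(
            DeliveryStreamName=STREAM_NAME,
            Record={"Data": (json.dumps(record) + "\n").encode("utf-8")},
        )
        return {"statusCode": 200}
    ```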

    This should be significantly cheaper at scale than having a database around that’s not going to be queried most of the time.
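
    And when you do need to look at the data, an ad hoc Athena query might look like the sketch below. It assumes a Glue table named audit_events has been defined over the Firehose output prefix; the database, table, and results-bucket names are placeholders.

    ```python
    import boto3

    athena = boto3.client("athena")

    # Kick off a query against the hypothetical audit_events table; Athena
    # writes the result set to the S3 location given below.
    response = athena.start_query_execution(
        QueryString="""
            SELECT occurred_at, event_type, detail
            FROM audit_events
            WHERE occurred_at > to_unixtime(current_timestamp - interval '7' day)
            ORDER BY occurred_at DESC
        """,
        QueryExecutionContext={"Database": "audit_db"},
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )
    print(response["QueryExecutionId"])
    ```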

    Pricing note: DynamoDB has a free tier of 25 GB of storage per month; depending on what's going on in your account, that may be something to factor in.
