
I need to move data from one DynamoDB table to another table after doing a transformation.

What is the best approach to do that?

Do I need to write a script that reads selective data from one table and puts it in the other table, or should I follow a CSV export?

2 Answers


  1. You need to write a script to do so. However, you may wish to first export the data to S3 using DynamoDB’s native export feature, as it does not consume capacity on the table and therefore does not impact production traffic.

    If your table is not serving production traffic, or the table is not too large, then you can simply use a Lambda function to read your items, transform them, and write them to the new table (see the sketch below).

    If your table is large, you can use AWS Glue to achieve the same result in a distributed fashion.
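
    A minimal sketch of that read/transform/write loop, assuming boto3, placeholder table names ("source-table", "target-table"), and a hypothetical transform() function; adapt the transformation and, for big tables, the scan strategy to your own case.

    ```python
    import boto3

    dynamodb = boto3.resource("dynamodb")
    source = dynamodb.Table("source-table")  # placeholder table names
    target = dynamodb.Table("target-table")

    def transform(item):
        # Hypothetical transformation: replace with your own logic.
        item["migrated"] = True
        return item

    def migrate():
        scan_kwargs = {}
        # batch_writer buffers writes and retries unprocessed items.
        with target.batch_writer() as batch:
            while True:
                response = source.scan(**scan_kwargs)
                for item in response["Items"]:
                    batch.put_item(Item=transform(item))
                # LastEvaluatedKey is absent once the whole table has been scanned.
                if "LastEvaluatedKey" not in response:
                    break
                scan_kwargs["ExclusiveStartKey"] = response["LastEvaluatedKey"]

    if __name__ == "__main__":
        migrate()
    ```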

  2. Is this a live table that is used in production?

    If it is, this is what I usually do:

    • Enable DynamoDB Streams on the original table (if not already enabled)
    • Create a Lambda function that has access to both tables
    • Place the transformation logic in the Lambda (see the sketch below)
    • Subscribe the Lambda to the DynamoDB stream
    • Touch every item in the original table (for example, set a new attribute such as ‘migrated’) so that each item appears on the stream
    • Now all items will flow through the Lambda, and it can write them, transformed, to the new table
    • You can now switch to the new table
    • Check that everything still works
    • Delete the Lambda and the old table, and disable DynamoDB Streams (if needed)

    This approach is the only one I found that can guarantee 100% uptime during the migration.
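
    A minimal sketch of such a stream-subscribed Lambda, assuming boto3, a placeholder target table named "new-table", a hypothetical transform() function, and a stream configured to include new images:

    ```python
    import boto3
    from boto3.dynamodb.types import TypeDeserializer

    deserializer = TypeDeserializer()
    target = boto3.resource("dynamodb").Table("new-table")  # placeholder name

    def transform(item):
        # Hypothetical transformation: replace with your own logic.
        item["migrated"] = True
        return item

    def handler(event, context):
        with target.batch_writer() as batch:
            for record in event["Records"]:
                # Copy inserts and updates; skip deletes.
                if record["eventName"] == "REMOVE":
                    continue
                # Stream records arrive as DynamoDB JSON, so deserialize first.
                image = record["dynamodb"]["NewImage"]
                item = {k: deserializer.deserialize(v) for k, v in image.items()}
                batch.put_item(Item=transform(item))
    ```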

    If the table is not live, then you can just export it to S3 and then import it into the new table:

    https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBPipeline.html
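
    As an alternative to the Data Pipeline approach in that link, the native ExportTableToPointInTime / ImportTable APIs can do the same job. A rough sketch with boto3, assuming point-in-time recovery is enabled on the source table and that the ARN, bucket, key prefix, and key schema below are placeholders; note that ImportTable always creates a brand-new table, so any transformation has to happen on the exported files (for example with a Glue job) before importing.

    ```python
    import boto3

    client = boto3.client("dynamodb")

    # Export to S3 without consuming the source table's read capacity.
    # Requires point-in-time recovery to be enabled on the table.
    export = client.export_table_to_point_in_time(
        TableArn="arn:aws:dynamodb:us-east-1:123456789012:table/source-table",
        S3Bucket="my-migration-bucket",
    )
    print(export["ExportDescription"]["ExportArn"])

    # Once the export has completed, import the data into a new table.
    # Point the key prefix at the export's data folder; exported files are
    # gzip-compressed DynamoDB JSON.
    client.import_table(
        S3BucketSource={
            "S3Bucket": "my-migration-bucket",
            "S3KeyPrefix": "AWSDynamoDB/<export-id>/data/",  # placeholder
        },
        InputFormat="DYNAMODB_JSON",
        InputCompressionType="GZIP",
        TableCreationParameters={
            "TableName": "new-table",
            "AttributeDefinitions": [{"AttributeName": "pk", "AttributeType": "S"}],
            "KeySchema": [{"AttributeName": "pk", "KeyType": "HASH"}],
            "BillingMode": "PAY_PER_REQUEST",
        },
    )
    ```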
