
Here is my situation. I am using the Alteryx ETL tool, where we append new records to Tableau using the 'Overwrite the file' option.

What this does is capture all incoming data in the target, delete the old data, and then publish the results in the Tableau visualisation tool.

So whatever data comes in from the source must overwrite the existing data in the sink table.

How can we achieve this in Azure Data Flow?

2 Answers


  1. If your requirement is just to copy data from your source to your target and truncate the table before the latest data is copied, then you can use a copy activity in Azure Data Factory. The copy activity has an option called Pre-copy script, in which you can specify a query that truncates the table before the latest data is copied in.
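    The truncate-then-load behaviour the pre-copy script gives you can be sketched like this (using Python's sqlite3 as a stand-in for the sink database; the `sales` table and its columns are illustrative, not from the question):

    ```python
    import sqlite3

    # In-memory database standing in for the target (sink) table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
    conn.execute("INSERT INTO sales VALUES (1, 10.0), (2, 20.0)")  # old data

    # Pre-copy script step: clear the target before loading.
    # (Against Azure SQL the pre-copy script would typically be
    # `TRUNCATE TABLE dbo.sales`; SQLite has no TRUNCATE, so DELETE FROM
    # is used here to illustrate the same effect.)
    conn.execute("DELETE FROM sales")

    # Copy step: load the latest source rows into the emptied target.
    latest_rows = [(3, 30.0), (4, 40.0)]
    conn.executemany("INSERT INTO sales VALUES (?, ?)", latest_rows)

    print(conn.execute("SELECT id, amount FROM sales ORDER BY id").fetchall())
    # -> [(3, 30.0), (4, 40.0)]
    ```

    Only the latest rows survive, which matches the "overwrite everything" behaviour described in the question.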


    Here is an article by a community volunteer where a similar requirement is discussed with various approaches: How to truncate table in Azure Data Factory

    If your requirement is to transform the data first and then copy it to your target SQL table, truncating the table before you copy the latest transformed data, then you will have to use a mapping data flow activity.

  2. If you are writing to a database table, you'll see a sink setting called "truncate table", which removes all previous rows and replaces them with your new rows. Or, if you want to overwrite only specific rows based on a key, use an Alter Row transformation with the "Update" option.
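    The update-by-key behaviour of an Alter Row "Update" can be sketched the same way (again with sqlite3 as a stand-in sink; table and column names are illustrative):

    ```python
    import sqlite3

    # Target table with a key column, holding the previously loaded rows.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")
    conn.execute("INSERT INTO sales VALUES (1, 10.0), (2, 20.0)")

    # Incoming rows keyed on id: row 2 has changed, row 3 is new.
    incoming = [(2, 99.0), (3, 30.0)]

    # Overwrite matching keys in place instead of truncating everything,
    # which is the effect an Alter Row "Update" policy produces at the sink.
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount",
        incoming,
    )

    print(conn.execute("SELECT id, amount FROM sales ORDER BY id").fetchall())
    # -> [(1, 10.0), (2, 99.0), (3, 30.0)]
    ```

    Unmatched rows (id 1) are untouched, matched rows (id 2) are overwritten, and new rows (id 3) are added.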
