Azure pipeline get output value data from lookup activity to data flow source

I have a problem. I got data from an on-premises Oracle database via a query in a Lookup activity. Now I want to get this data into a Data Flow source so I can append it to my master CSV file in Blob storage.

Please help me out with this.

2 Answers


  1. You can parameterize the data flow based on

    @activity('LookupActivityName').output.value

    Check this out:

    https://learn.microsoft.com/en-us/answers/questions/1136282/pass-lookup-activity-output-to-dataflow
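
    For reference, here is a minimal sketch of that binding in the Execute Data Flow activity's pipeline JSON. It assumes a data flow named MyDataFlow with an array parameter inputArray; the activity and parameter names are illustrative, and the exact quoting of the parameter value can vary with the parameter type:

        {
            "name": "RunMyDataFlow",
            "type": "ExecuteDataFlow",
            "typeProperties": {
                "dataFlow": {
                    "referenceName": "MyDataFlow",
                    "type": "DataFlowReference",
                    "parameters": {
                        "inputArray": {
                            "value": "@activity('LookupActivityName').output.value",
                            "type": "Expression"
                        }
                    }
                }
            }
        }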

  2. "Azure pipeline get output value data from lookup activity to data flow source."

    If you want to do it with only a Lookup activity, first build the array of column values using a ForEach, then pass that array as an array parameter to the data flow, as suggested by @Ziya Mert Karakas (see the sketch below).
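
    A rough sketch of that collection step, assuming a pipeline array variable named colValues and a lookup column named COL_A (both illustrative); the variable can then be passed to the data flow parameter as @variables('colValues'):

        {
            "name": "CollectColumnValues",
            "type": "ForEach",
            "typeProperties": {
                "items": {
                    "value": "@activity('LookupActivityName').output.value",
                    "type": "Expression"
                },
                "activities": [
                    {
                        "name": "AppendValue",
                        "type": "AppendVariable",
                        "typeProperties": {
                            "variableName": "colValues",
                            "value": {
                                "value": "@item().COL_A",
                                "type": "Expression"
                            }
                        }
                    }
                ]
            }
        }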

    In the data flow, use a dummy source, add a derived column, and unfold the parameter array to get it as a column. You can go through this SO answer by @RithwikBojja to learn more about it.
    But this method only works if there is only a single column in your data. If there are multiple columns, unfolding multiple column arrays at a time might not be possible in a data flow.
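
    In data flow script terms, the unfold step looks roughly like the sketch below. It assumes an array parameter named inputArray on the data flow and uses a one-row dummy source plus a flatten (foldDown) transformation; all names are illustrative, and the linked answer shows the full pattern:

        source(output(dummy as string), allowSchemaDrift: true, validateSchema: false) ~> DummySource
        DummySource derive(arrCol = $inputArray) ~> AddArrayColumn
        AddArrayColumn foldDown(unroll(arrCol), mapColumn(value = arrCol)) ~> UnfoldArray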

    As an alternative, you can use the method below, which involves a temporary file. First copy the Oracle data to a temporary blob file using a Copy activity, then run a data flow after it.
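
    A minimal sketch of that Copy activity, assuming datasets named OracleSourceDataset and TempBlobCsvDataset and a placeholder query; all of these names are illustrative:

        {
            "name": "CopyOracleToTempBlob",
            "type": "Copy",
            "inputs": [ { "referenceName": "OracleSourceDataset", "type": "DatasetReference" } ],
            "outputs": [ { "referenceName": "TempBlobCsvDataset", "type": "DatasetReference" } ],
            "typeProperties": {
                "source": {
                    "type": "OracleSource",
                    "oracleReaderQuery": "SELECT * FROM MY_TABLE"
                },
                "sink": { "type": "DelimitedTextSink" }
            }
        }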

    "Data Flow source so I can append this data to my master CSV file in Blob storage"

    But the data flow will overwrite the sink blob CSV file. If you want to append to your master CSV file, use two sources: the first is your master CSV file and the second is the temporary blob file. Do a union transformation on these two by name, like the sample below.
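
    In data flow script terms, the two sources and the union look roughly like this sketch (the stream names are illustrative):

        source(allowSchemaDrift: true, validateSchema: false) ~> MasterCsv
        source(allowSchemaDrift: true, validateSchema: false) ~> TempCsv
        MasterCsv, TempCsv union(byName: true) ~> UnionByName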

    [Screenshot: master CSV file with some data]

    [Screenshot: temporary file with the new data from the Copy activity]

    [Screenshot: union transformation by name]

    Then give the same master CSV file dataset as the sink. Check "Output to single file" and give the master CSV file name there.
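
    As a sketch, the sink in data flow script can look like the following, assuming the union stream above and a file named master.csv; single-file output is what partitionBy('hash', 1) plus partitionFileNames expresses, and the names here are illustrative:

        UnionByName sink(allowSchemaDrift: true,
            validateSchema: false,
            partitionFileNames:['master.csv'],
            skipDuplicateMapInputs: true,
            skipDuplicateMapOutputs: true,
            partitionBy('hash', 1)) ~> MasterCsvSink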


    [Screenshot: sink result with the appended data]

    After the Copy activity writes the temporary file, run the Data Flow activity; executing the pipeline gets the appended data into the master CSV file.
