
I have a few folders inside the data lake (example: Test1 container) that are created every month in the format YYYY-MM (example: 2022-11), and inside each of these folders there is a set of data files. I want to copy these data files to different folders in the data lake.

The next month, a new folder is created in the same container (Test1) named 2022-12, and so on: 2023-01, etc. I want to copy the files inside these folders to a different data lake folder every month.

How can I achieve this?

2 Answers


  1. Chosen as BEST ANSWER

    The solution is described in this thread: Create a folder based on date (YYYY-MM) using Data Factory?

    Follow the Sink Dataset section and the Copy Sink section there, remove the sinkfilename parameter from the dataset, and use this dataset as the source in the copy activity.

    It worked for me.
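    For reference, the core of that approach is a dynamic folder path on the sink dataset. A minimal sketch, assuming the usual formatDateTime pattern and the same YYYY-MM naming as the source folders:

    @formatDateTime(utcnow(), 'yyyy-MM')

    This resolves to a folder name such as 2022-11 for November 2022.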


  2. Alternative approach, for reading folders with the date format YYYY-MM.

    I reproduced the same scenario in my environment with a copy activity.

    • Open the sink dataset and create a parameter named Folder.


    Go to the Connection tab and add this dynamic content for the folder path: @dataset().folder

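    A minimal sketch of what the parameterized sink dataset JSON might look like, assuming a DelimitedText dataset on ADLS Gen2; the dataset, linked service, and container names (SinkDataset, AzureDataLakeStorage1, test1) are placeholders you would replace with your own:

    {
        "name": "SinkDataset",
        "properties": {
            "linkedServiceName": {
                "referenceName": "AzureDataLakeStorage1",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "Folder": { "type": "string" }
            },
            "type": "DelimitedText",
            "typeProperties": {
                "location": {
                    "type": "AzureBlobFSLocation",
                    "fileSystem": "test1",
                    "folderPath": {
                        "value": "@dataset().Folder",
                        "type": "Expression"
                    }
                }
            }
        }
    }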

    Then supply a value for the Folder parameter in the copy activity sink. You can add this dynamic content:

    @formatDateTime(utcnow(), 'yyyy/MM')

    Or

    @concat(formatDateTime(utcnow(), 'yyyy'), '/', formatDateTime(utcnow(), 'MM'))
    

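    An abbreviated sketch of the copy activity JSON showing how that expression is passed as the value of the Folder parameter; the activity and dataset names are placeholders, and the source/sink type properties are trimmed for brevity:

    {
        "name": "CopyToMonthlyFolder",
        "type": "Copy",
        "inputs": [
            { "referenceName": "SourceDataset", "type": "DatasetReference" }
        ],
        "outputs": [
            {
                "referenceName": "SinkDataset",
                "type": "DatasetReference",
                "parameters": {
                    "Folder": {
                        "value": "@concat(formatDateTime(utcnow(), 'yyyy'), '/', formatDateTime(utcnow(), 'MM'))",
                        "type": "Expression"
                    }
                }
            }
        ],
        "typeProperties": {
            "source": { "type": "DelimitedTextSource" },
            "sink": { "type": "DelimitedTextSink" }
        }
    }

    At run time the expression resolves to something like 2022/11, so the copied files land in a year/month folder hierarchy in the sink.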

    The pipeline executed successfully and produced the expected output.

