
We have 5 vendors that are SFTPing files to Blob Storage. When the files come in, I need to copy them to another container and create a folder in that container named with the date to put the files in. From the second container, I need to copy the files to a file share on an Azure server. What is the best way to go about this?

I’m very new to Azure and unsure what the best way is to accomplish what I am being asked to do. Any help would be greatly appreciated.

3 Answers


  1. I’d recommend using Azure Synapse for this task. It lets you move data between different storage services securely and with little to no code.

    Specifically, I’d put a blob storage trigger on the SFTP blob container so that the Synapse pipeline that moves the data runs automatically when your vendors drop their files.

    Note that when you look for documentation on how to do things in Synapse, most of the time the Azure Data Factory documentation will also be applicable, since most of Data Factory’s functionality is now in Synapse.

    The ADF and Synapse YouTube channels are excellent resources, as are the Microsoft Learn courses on Data Engineering.

  2. "I need to copy them to another container and create a folder in that container named with the date to put the files in."

    You can use AzCopy to copy files to another container by using a SAS token.

    Command:

    azcopy copy 'https://<storage account>.blob.core.windows.net/test/files?SAS' 'https://<storage account>.blob.core.windows.net/mycontainer/12-01-2023?SAS' --recursive
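
    To name the destination folder with the current date instead of a hard-coded value, you can build the path in the shell before calling AzCopy. A minimal sketch, assuming a Linux/macOS shell with AzCopy installed; the account, container, and SAS values are placeholders:

    SAS='<SAS token>'                                # needs read/list on the source and write on the destination
    ACCOUNT='https://<storage account>.blob.core.windows.net'
    TODAY=$(date +%m-%d-%Y)                          # e.g. 12-01-2023

    azcopy copy "$ACCOUNT/test/files?$SAS" "$ACCOUNT/mycontainer/$TODAY?$SAS" --recursive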
    


    "I need to copy the files to a file share on an Azure server"

    You can also copy the files from the container to a file share by using AzCopy.

    Command:

    azcopy copy 'https://<storage account>.blob.core.windows.net/test?SAS' 'https://<storage account>.file.core.windows.net/fileshare/12-01-2023?SAS' --recursive
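
    The same dated folder name can be reused for the second hop so the file share mirrors the container layout. A minimal sketch with placeholder account names; AzCopy supports blob-to-file copies, and the blob source and the file share destination each need their own SAS:

    TODAY=$(date +%m-%d-%Y)                          # same folder name as the first copy, e.g. 12-01-2023
    BLOB_SAS='<blob SAS token>'
    FILE_SAS='<file share SAS token>'

    azcopy copy "https://<storage account>.blob.core.windows.net/mycontainer/$TODAY?$BLOB_SAS" "https://<storage account>.file.core.windows.net/fileshare/$TODAY?$FILE_SAS" --recursive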
    


    You can get the SAS token through the portal:

    Go to the portal -> your storage account -> Shared access signature -> check the resource types -> click Generate SAS and connection string.
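
    If you prefer the command line to the portal, an account-level SAS can also be generated with the Azure CLI. A minimal sketch; the account name, key, and expiry are placeholders, and you can narrow the permissions to what the copy actually needs:

    # services: b = blob, f = file; resource types: s/c/o = service/container/object
    az storage account generate-sas \
        --account-name '<storage account>' \
        --account-key '<account key>' \
        --services bf \
        --resource-types sco \
        --permissions rwdlac \
        --expiry 2024-01-01T00:00Z \
        --https-only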


  3. AzCopy is probably a good way to move all or some of the blobs from one container to another, but I would suggest automating it with Azure Functions. The copy can be automated by triggering an Azure Function every time a blob or set of blobs is uploaded to the source container (Azure can process a batch of blobs).

    A note on Azure Functions: depending on the number of blobs to be moved and how long the copy could take, Durable Functions may be the better solution to avoid timeout exceptions. A durable function returns an immediate response while the work keeps running in the background.

    Consider this article for a more complete walkthrough of this approach:
    https://build5nines.com/azure-functions-copy-blob-between-azure-storage-accounts-in-c/
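
    If you go the Functions route, the project can be scaffolded locally with Azure Functions Core Tools and then extended with the copy logic from the article above. A rough sketch; the project and function names are placeholders, and the exact template names vary between Core Tools versions, so check the output of "func templates list" first:

    # create a Python function app project and add a blob-triggered function
    func init blob-copy-app --worker-runtime python
    cd blob-copy-app
    func new --name CopyUploadedBlob --template "Blob trigger"

    # run locally; the function fires whenever a blob lands in the configured container
    func start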
