I want to copy files from blob storage A to blob storage B, with the condition that only new files (ones not already present in B) get copied/written to storage B.
I’m using the BlockBlobService package to list all files in blob storage A, but does this package also have a function to copy/write a file to another blob storage? I’m writing this in Python, by the way.
Please help me out :( I’m a bit stuck right now.
I tried to use the DataLakeServiceClient package to write a file to Azure blob storage B, but DataLakeServiceClient is not compatible with BlockBlobService, so I don’t know what to do.
If you have used another method to do the same thing, please share your wisdom and knowledge with me.
2 Answers
I would suggest trying the azcopy tool; it supports copying data between storage accounts.
Use it with the --overwrite flag set to false or ifSourceNewer to specify the behavior for blobs that already exist at the destination. See Microsoft's "Get started with AzCopy" documentation for how to set it up.
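A sketch of what such a copy could look like (the account names, container names, and SAS tokens below are placeholders you would replace with your own):

```shell
# Copy an entire container from storage account A to storage account B,
# overwriting a destination blob only when the source copy is newer.
# Setting --overwrite false instead would skip all existing blobs.
azcopy copy \
  "https://sourceaccount.blob.core.windows.net/source-container?<source-SAS>" \
  "https://destaccount.blob.core.windows.net/dest-container?<dest-SAS>" \
  --recursive \
  --overwrite ifSourceNewer
```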
After reproducing from my end, I was able to achieve this using get_blob_to_path and create_blob_from_path of BlockBlobService.