I need to trigger a Python script hosted on Azure that parses data in a folder.
How can this script be triggered when a new folder is added to the cloud?
I'm completely clueless about how to do this and would be very thankful for any advice or hints.
2 Answers
I'm guessing this folder isn't a git repo? If it were, you could potentially have a pipeline run your script based on a trigger.
Wherever this folder is, you could also set up a scheduled task/job somewhere that periodically scans for new folders and runs your script on them.
You'd need to ensure that your script won't cause issues if it runs multiple times for the same folder, and also think about how to deal with failed runs (say it runs every 15 minutes, one run fails, and whatever folder was created there is then never processed).
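As an illustration only, a minimal sketch of that kind of idempotent polling job in Python; the paths and the way the parsing script is invoked are placeholders:

```python
import json
import subprocess  # placeholder: however you actually invoke the parsing script
from pathlib import Path

WATCH_DIR = Path("/data/incoming")         # hypothetical folder to watch
STATE_FILE = Path("/data/processed.json")  # record of folders already handled


def load_processed() -> set:
    """Return the names of folders that were already processed."""
    if STATE_FILE.exists():
        return set(json.loads(STATE_FILE.read_text()))
    return set()


def main():
    processed = load_processed()
    for folder in sorted(p for p in WATCH_DIR.iterdir() if p.is_dir()):
        if folder.name in processed:
            continue  # already handled in an earlier run
        try:
            # Replace with your real parsing logic or script invocation.
            subprocess.run(["python", "parse_data.py", str(folder)], check=True)
        except subprocess.CalledProcessError:
            # Leave it unmarked so the next scheduled run retries it.
            continue
        processed.add(folder.name)
    STATE_FILE.write_text(json.dumps(sorted(processed)))


if __name__ == "__main__":
    main()
```

Because the state file only records folders that finished successfully, re-running the job after a failure simply picks them up again.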
You can also look at something more advanced, like monitoring for changes using .NET: https://www.howtogeek.com/devops/how-to-monitor-a-windows-folder-for-new-files-and-take-action/
It all depends on where these folders are, how frequently there will be new ones, etc.
This question is a bit too broad; Azure has too many services to answer it in general.
I notice you added these tags: azure-functions, azure-devops, azure-pipelines. So I will provide two feasible solutions based on these technology combinations below (azure-active-directory relates to authentication, so I ignored that tag).
1. Azure Blob Storage and Azure Functions
I noticed the azure-functions tag you added. If you use an Azure Function, it needs to be able to monitor changes in the target location. Using Azure Blob Storage as the source and uploading the folders and files to Blob Storage is one possible approach. In that case, you can set up a Python blob-triggered Azure Function to run the script whenever files are added.
Code Sample:
__init__.py, function.json, local.settings.json
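A minimal sketch of the `__init__.py`, assuming the standard Python programming model with a `function.json` binding; the function body is where your parsing logic would go:

```python
import logging

import azure.functions as func


def main(myblob: func.InputStream):
    # Runs whenever a new blob lands in the monitored container.
    logging.info(
        "Python blob trigger function processed blob\nName: %s\nBlob size: %s bytes",
        myblob.name,
        myblob.length,
    )
    # Call your parsing logic on the blob contents here.
```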
The above code and configuration are just for local testing. If you deploy to an Azure Function App, you only need to add the storage account connection string 'bowmanxxx_STORAGE' to the Function App's application settings.
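For illustration, the `function.json` binding would reference that connection setting name; the container path `samples-workitems/{name}` below is just a placeholder for wherever the folders are uploaded:

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "samples-workitems/{name}",
      "connection": "bowmanxxx_STORAGE"
    }
  ]
}
```

Locally, `local.settings.json` would hold the actual connection string under the same `bowmanxxx_STORAGE` key.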
2. Azure DevOps Git repository and Azure DevOps YAML pipeline
The idea of this combination is to push the folder to an Azure DevOps Git repository, and then use the CI trigger built into Azure Pipelines to catch changes to the specified path.
azure-pipelines.yml
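A sketch of what the CI trigger could look like; the branch name, the watched `data` path, and the script name `parse_data.py` are placeholders:

```yaml
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - data          # only run when something under data/ changes

pool:
  vmImage: ubuntu-latest

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.x'

  - script: python parse_data.py
    displayName: Run the parsing script
```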
Repository Structure:
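As an example only, the repository could be laid out roughly like this, with the pipeline watching the `data/` folder:

```
.
├── azure-pipelines.yml
├── parse_data.py
└── data/
    └── <new folders go here>
```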
I think the key to solving your question is the trigger.
See the trigger documentation for the technologies above:
Python Blob Storage Trigger
Azure Git Repository CI Trigger