
I need to trigger a Python script that lives in Azure and parses data in a folder.

How can this script be triggered when a new folder is added to the cloud?

I'm completely clueless about how to do this and would be very thankful for any advice or hint.

2 Answers


  1. I’m guessing this folder isn’t a git repo? If it were, you could potentially have a pipeline run your script based on a trigger.

    Wherever this folder lives, you could also set up a scheduled task/job somewhere that:

    • lists the directory contents (basically a DIR)
    • finds new folders created in the past X minutes/hours
    • throws those folders into an array, then iterates through it, running your script against each one (see the sketch below)

    You’d need to ensure that your script won’t cause issues if it runs multiple times for the same folder, and also decide how to deal with failed runs (say it runs every 15 minutes and one run fails; whatever folder was created in that window is then never processed).
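
    A minimal Python sketch of that polling idea (just an assumption of mine — the watch path, the 15-minute window, and the parse_data.py script name are all placeholders):

    import os
    import subprocess
    import time

    WATCH_DIR = "/mnt/data/incoming"   # placeholder: wherever the new folders land
    WINDOW_SECONDS = 15 * 60           # look back one scheduling interval

    def new_folders(root, window):
        """Return sub-folders of `root` created within the last `window` seconds."""
        cutoff = time.time() - window
        return [os.path.join(root, d) for d in os.listdir(root)
                if os.path.isdir(os.path.join(root, d))
                and os.path.getctime(os.path.join(root, d)) >= cutoff]

    for folder in new_folders(WATCH_DIR, WINDOW_SECONDS):
        # Hypothetical parser script; keep it idempotent so repeat runs are harmless.
        subprocess.run(["python", "parse_data.py", folder], check=True)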

    You can also look at something more advanced, like monitoring for changes using .NET: https://www.howtogeek.com/devops/how-to-monitor-a-windows-folder-for-new-files-and-take-action/
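
    If you would rather stay in Python than use .NET, a rough equivalent (my assumption, not from that article) is the third-party watchdog package, which reacts to new folders as they appear instead of polling; the watch path is a placeholder:

    import time
    from watchdog.events import FileSystemEventHandler
    from watchdog.observers import Observer

    class NewFolderHandler(FileSystemEventHandler):
        def on_created(self, event):
            # Only react to newly created directories, not files.
            if event.is_directory:
                print(f"New folder detected: {event.src_path}")
                # Call your parsing script here.

    observer = Observer()
    observer.schedule(NewFolderHandler(), path="/mnt/data/incoming", recursive=False)  # placeholder path
    observer.start()
    try:
        while True:
            time.sleep(1)
    finally:
        observer.stop()
        observer.join()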

    It all depends on where these folders are, how frequently there’ll be new ones, etc.

  2. This question is a bit too broad; Azure has too many services that could fit.

    I notice you added these tags: azure-functions, azure-devops, azure-pipelines. So below I will offer two feasible solutions based on those combinations of technologies (azure-active-directory relates to authentication, so I ignored that tag).

    1. Azure Blob Storage and an Azure Function.

    I notice the azure-functions tag you added, but if you use an Azure Function, it needs some way to monitor changes at the target location. One feasible approach is to use Azure Blob Storage as the source and upload the folder and its files there. You can then set up a Python blob-trigger Azure Function that runs your script whenever files are added (a small upload sketch follows the configuration below).

    Code Sample:

    __init__.py

    import logging
    
    import azure.functions as func
    
    
    def main(myblob: func.InputStream):
        # Put your parsing logic here.
        logging.info(f"Python blob trigger function processed blob\n"
                     f"Name: {myblob.name}\n"
                     f"Blob Size: {myblob.length} bytes")
    

    function.json

    {
      "scriptFile": "__init__.py",
      "bindings": [
        {
          "name": "myblob",
          "type": "blobTrigger",
          "direction": "in",
          "path": "samples-path/{name}",
          "connection": "bowmanxxx_STORAGE"
        }
      ]
    }
    

    local.settings.json

    {
      "IsEncrypted": false,
      "Values": {
        "AzureWebJobsStorage": "",
        "FUNCTIONS_WORKER_RUNTIME": "python",
        "bowmanxxx_STORAGE": "DefaultEndpointsProtocol=https;AccountName=xxx;AccountKey=xxx;EndpointSuffix=core.windows.net"
      }
    }
    

    The above code and configuration are only for local testing. If you deploy to an Azure Function App, you just need to add the storage account connection string ‘bowmanxxx_STORAGE’ to the Function App’s application settings and it will work.
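
    As mentioned above, uploading the folder is what fires the trigger. Here is a minimal sketch of that upload step (my assumption, using the azure-storage-blob package; the connection string, the container name samples-path from the trigger path above, and the local folder are placeholders):

    import os
    from azure.storage.blob import BlobServiceClient

    conn_str = "DefaultEndpointsProtocol=https;AccountName=xxx;AccountKey=xxx;EndpointSuffix=core.windows.net"
    service = BlobServiceClient.from_connection_string(conn_str)
    container = service.get_container_client("samples-path")  # must match the trigger path

    local_folder = "new_folder"  # placeholder: the folder you want processed
    for root, _dirs, files in os.walk(local_folder):
        for name in files:
            path = os.path.join(root, name)
            # Keep the folder name as a virtual directory prefix in the blob name.
            blob_name = os.path.relpath(path).replace(os.sep, "/")
            with open(path, "rb") as data:
                container.upload_blob(name=blob_name, data=data, overwrite=True)

    Each uploaded blob will invoke the function once with that blob as its input.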

    2. Azure DevOps Git repository and an Azure DevOps YAML pipeline.

    The idea behind this combination is to push the folder to an Azure DevOps Git repository and then use the CI trigger built into Azure Pipelines to catch changes under the specified path.

    azure-pipelines.yml

    trigger:
      branches:
        include:
        - '*'
      paths:
        include:
        - upload_here
    
    pool:
      vmImage: ubuntu-latest
    
    steps:
    - task: PythonScript@0
      inputs:
        scriptSource: 'inline'
        script: |
          # Put your Python logic here.

          print("Replace the line above with your own logic.")
    

    Repository Structure:

    (screenshot of the repository layout omitted)
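
    Roughly, the layout would be as follows (my reconstruction from the path filter above, not the original screenshot):

    <repo root>
    ├── azure-pipelines.yml
    └── upload_here/
        └── ... your uploaded folders go here ...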


    I think the key to solving your question is the trigger.

    See the trigger documentation for the technologies above:

    Python Blob Storage Trigger

    Azure Git Repository CI Trigger
