
I have Python 3.9 code and an Azure DevOps pipeline that builds and deploys the code to Azure Functions.

The pipeline had been working fine for the past several weeks (it’s new), and there were no recent deployments until I had to fix a few lines of code and pushed to the Azure Repo, which triggered the pipeline. This time, however, the Function crashed when triggered. This is the error message:

Exception: ModuleNotFoundError: No module named 'holidays'. Cannot
find module. Please check the requirements.txt file for the missing
module.

This is my requirements.txt file, which clearly pins the holidays module version.

azure-functions
azure-functions-durable
azure-identity==1.14.0
azure-keyvault-secrets==4.7.0
azure-storage-blob==12.17.0
dotmap==1.3.30
holidays==0.32
json5==0.9.14
numpy==1.25.2
pandas==2.1.0
pyodbc==4.0.39
pytz==2023.3.post1
requests_oauthlib==1.3.1
sqlalchemy==2.0.20

If I deploy from VS Code using the Azure Functions extension, the code runs fine without any error, so I suspect the DevOps pipeline is acting up (even though nobody changed anything in the YAML file).

This is my azure-pipelines.yml file:

trigger:
  - main
  
variables:
  azureSubscription: '<REDACTED>'
  functionAppName: <REDACTED>
  functionAppProjectPath: $(System.DefaultWorkingDirectory)/
  pythonVersion: '3.9'
  vmImage: ubuntu-latest

stages:
- stage: Build
  displayName: Build Stage
  jobs:
    - job: Build
      displayName: Build
      pool:
        vmImage: $(vmImage)
      
      steps:
        - task: UsePythonVersion@0
          inputs:
            versionSpec: $(pythonVersion)

        - bash: |
            pip install -r requirements.txt
          workingDirectory: $(functionAppProjectPath)
          displayName: 'Install dependencies'

        - task: ArchiveFiles@2
          displayName: 'Archive files'
          inputs:
            rootFolderOrFile: $(functionAppProjectPath)
            includeRootFolder: false
            archiveType: zip
            archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
            replaceExistingArchive: true

        - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
          artifact: drop
- stage: Deploy
  displayName: Deploy Stage
  dependsOn: Build
  condition: succeeded()
  jobs:
    - deployment: Deploy
      displayName: Deploy
      environment: <REDACTED>
      pool:
        vmImage: $(vmImage)
      strategy:
        runOnce:
          deploy:
            steps:
              - task: AzureFunctionApp@2
                displayName: 'Azure Function App Deploy'
                inputs:
                  appType: functionAppLinux
                  appName: $(functionAppName)
                  azureSubscription: $(azureSubscription)
                  package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'

I don’t know what’s wrong here. Hoping someone could point the problem out. Thanks.

2 Answers


  1. To resolve the ModuleNotFoundError, install the dependencies into the project's .python_packages/lib/site-packages folder so they are bundled into the deployment archive — a plain pip install puts them in the build agent's environment, which never makes it into the zip. Use the step below in your Azure DevOps YAML pipeline:

    - bash: |
        pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
      workingDirectory: $(workingDirectory)
      displayName: 'Install application dependencies'
    

    Also make sure your pipeline is pointed at the correct branch of the Azure Repository, i.e. the branch that contains your Function code.

    Complete YAML pipeline for a Durable Function:

    trigger:
    - master
    
    variables:
      azureSubscription: '7bxxxxxxxxb50eee'
    
      functionAppName: 'valleyfunc8'
    
      vmImageName: 'ubuntu-latest'
    
      workingDirectory: '$(System.DefaultWorkingDirectory)/'
    
    stages:
    - stage: Build
      displayName: Build stage
    
      jobs:
      - job: Build
        displayName: Build
        pool:
          vmImage: $(vmImageName)
    
        steps:
        - bash: |
            if [ -f extensions.csproj ]
            then
                dotnet build extensions.csproj --runtime ubuntu.16.04-x64 --output ./bin
            fi
          workingDirectory: $(workingDirectory)
          displayName: 'Build extensions'
    
        - task: UsePythonVersion@0
          displayName: 'Use Python 3.9'
          inputs:
            versionSpec: 3.9 
    
        - bash: |
            pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
          workingDirectory: $(workingDirectory)
          displayName: 'Install application dependencies'
    
        - task: ArchiveFiles@2
          displayName: 'Archive files'
          inputs:
            rootFolderOrFile: '$(workingDirectory)'
            includeRootFolder: false
            archiveType: zip
            archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
            replaceExistingArchive: true
    
        - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
          artifact: drop
    
    - stage: Deploy
      displayName: Deploy stage
      dependsOn: Build
      condition: succeeded()
    
      jobs:
      - deployment: Deploy
        displayName: Deploy
        environment: 'development'
        pool:
          vmImage: $(vmImageName)
    
        strategy:
          runOnce:
            deploy:
    
              steps:
              - task: AzureFunctionApp@1
                displayName: 'Azure functions app deploy'
                inputs:
                  azureSubscription: '$(azureSubscription)'
                  appType: functionAppLinux
                  appName: $(functionAppName)
                  package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
    

    My requirements.txt:

    azure-functions
    azure-functions-durable
    azure-identity==1.14.0
    azure-keyvault-secrets==4.7.0
    azure-storage-blob==12.17.0
    dotmap==1.3.30
    holidays==0.32
    json5==0.9.14
    numpy==1.25.2
    pandas==2.1.0
    pyodbc==4.0.39
    pytz==2023.3.post1
    requests_oauthlib==1.3.1
    sqlalchemy==2.0.20
    

    Sample Durable Function code:

    import logging
    import json
    
    from azure.durable_functions import DurableOrchestrationContext, Orchestrator
    import azure.durable_functions as df  # the http_start function below uses this alias
    import azure.functions as func
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient
    from azure.storage.blob import BlobServiceClient
    import numpy as np
    import pandas as pd
    
    # Orchestrator Function
    def orchestrator_function(context: DurableOrchestrationContext):
        result1 = yield context.call_activity('HelloSecret', "Tokyo")
        result2 = yield context.call_activity('HelloBlob', "Seattle")
        result3 = yield context.call_activity('HelloData', "London")
        return [result1, result2, result3]
    
    main = Orchestrator.create(orchestrator_function)
    
    # Activity Function to Read Secret
    def hello_secret_activity(name: str) -> str:
        logging.info(f"Reading secret for: {name}")
    
        # Initialize the DefaultAzureCredential
        credential = DefaultAzureCredential()
        
        # Access a secret from Azure Key Vault
        key_vault_url = "https://<your-key-vault-name>.vault.azure.net/"
        secret_client = SecretClient(vault_url=key_vault_url, credential=credential)
        secret_name = "<your-secret-name>"
        secret = secret_client.get_secret(secret_name)
        
        return f"Secret for {name}: {secret.value}"
    
    # Activity Function to List Blobs
    def hello_blob_activity(name: str) -> str:
        logging.info(f"Listing blobs for: {name}")
    
        # Initialize the DefaultAzureCredential
        credential = DefaultAzureCredential()
        
        # Access Azure Blob Storage
        blob_service_client = BlobServiceClient(account_url="https://<your-storage-account-name>.blob.core.windows.net/", credential=credential)
        container_client = blob_service_client.get_container_client("your-container-name")
        blob_list = container_client.list_blobs()
        blob_names = [blob.name for blob in blob_list]
        
        return f"Blobs for {name}: {blob_names}"
    
    # Activity Function to Perform Data Processing
    def hello_data_activity(name: str) -> str:
        logging.info(f"Performing data processing for: {name}")
    
        # Example usage of pandas and numpy
        data = {
            'A': np.random.rand(10),
            'B': np.random.rand(10),
        }
        df = pd.DataFrame(data)
        df_summary = df.describe().to_json()
        
        return f"Data summary for {name}: {df_summary}"
    
    # HTTP Trigger to Start the Orchestration
    async def http_start(req: func.HttpRequest, starter: str) -> func.HttpResponse:
        client = df.DurableOrchestrationClient(starter)
        
        instance_id = await client.start_new(req.route_params["functionName"], None, None)
    
        logging.info(f"Started orchestration with ID = '{instance_id}'.")
    
        return func.HttpResponse(f"Started orchestration with ID = '{instance_id}'.", status_code=202)
    

    Sample Http Trigger code:

    import logging
    import azure.functions as func
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient
    from azure.storage.blob import BlobServiceClient
    from dotmap import DotMap
    import holidays
    import json5
    import numpy as np
    import pandas as pd
    import pyodbc
    import pytz
    from requests_oauthlib import OAuth2Session
    from sqlalchemy import create_engine
    
    def main(req: func.HttpRequest) -> func.HttpResponse:
        logging.info('Python HTTP trigger function processed a request.')
    
        # Initialize the DefaultAzureCredential
        credential = DefaultAzureCredential()
        
        # Access a secret from Azure Key Vault
        key_vault_url = "https://siliconkeyvault9.vault.azure.net/"
        secret_client = SecretClient(vault_url=key_vault_url, credential=credential)
        secret_name = "secret3"
        secret = secret_client.get_secret(secret_name)
        logging.info(f"Secret: {secret.value}")
    
        # Access Azure Blob Storage
        blob_service_client = BlobServiceClient(account_url="https://siliconstrg8.blob.core.windows.net/", credential=credential)
        container_client = blob_service_client.get_container_client("your-container-name")
        blob_list = container_client.list_blobs()
        blob_names = [blob.name for blob in blob_list]
        logging.info(f"Blobs in container: {blob_names}")
    
        # Example usage of pandas and numpy
        data = {
            'A': np.random.rand(10),
            'B': np.random.rand(10),
        }
        df = pd.DataFrame(data)
        df_summary = df.describe().to_json()
        
        # Example usage of holidays
        us_holidays = holidays.US(years=2024)
        holidays_list = [str(date) for date in us_holidays]
    
        response = {
            "secret_value": secret.value,
            "blobs": blob_names,
            "data_summary": json5.loads(df_summary),
            "holidays": holidays_list
        }
    
        return func.HttpResponse(json5.dumps(response), mimetype="application/json")
    

    Sample YAML pipeline for the Http Trigger code:

    trigger:
    - main
    
    variables:
      azureSubscription: '4d3e6xxxxxxf22707ba'
    
      functionAppName: 'siliconfuncapp'
    
      vmImageName: 'ubuntu-latest'
    
      workingDirectory: '$(System.DefaultWorkingDirectory)/'
    
    stages:
    - stage: Build
      displayName: Build stage
    
      jobs:
      - job: Build
        displayName: Build
        pool:
          vmImage: $(vmImageName)
    
        steps:
        - bash: |
            if [ -f extensions.csproj ]
            then
                dotnet build extensions.csproj --runtime ubuntu.16.04-x64 --output ./bin
            fi
          workingDirectory: $(workingDirectory)
          displayName: 'Build extensions'
    
        - task: UsePythonVersion@0
          displayName: 'Use Python 3.9'
          inputs:
            versionSpec: 3.9 
        - bash: |
            pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
          workingDirectory: $(workingDirectory)
          displayName: 'Install application dependencies'
    
        - task: ArchiveFiles@2
          displayName: 'Archive files'
          inputs:
            rootFolderOrFile: '$(workingDirectory)'
            includeRootFolder: false
            archiveType: zip
            archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
            replaceExistingArchive: true
    
        - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
          artifact: drop
    
    - stage: Deploy
      displayName: Deploy stage
      dependsOn: Build
      condition: succeeded()
    
      jobs:
      - deployment: Deploy
        displayName: Deploy
        environment: 'development'
        pool:
          vmImage: $(vmImageName)
    
        strategy:
          runOnce:
            deploy:
    
              steps:
              - task: AzureFunctionApp@1
                displayName: 'Azure functions app deploy'
                inputs:
                  azureSubscription: '$(azureSubscription)'
                  appType: functionAppLinux
                  appName: $(functionAppName)
                  package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
    

    Output:

    For both the Durable Function and the Http Trigger deployments, the pipeline logs confirm the holidays package is installed correctly and the functions run without the ModuleNotFoundError.

  2. I’m using a similar YAML to SiddheshDesai’s and it works fine. If you still get the same error after trying pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt, use the steps below to narrow down the issue.

    1. Download the artifact from the pipeline and ZIP-deploy your function using the Azure CLI: az functionapp deployment source config-zip -g <resource_group> -n <app_name> --src <zip_file_path>.
    2. If the function fails with the same error, the issue is in your artifact. Check the debug log of the pip install step and confirm that holidays was installed, like:
    Collecting holidays==0.32
      Downloading holidays-0.32-py3-none-any.whl (754 kB)
         ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 754.4/754.4 KB 30.2 MB/s eta 0:00:00
    3. If you can’t see holidays in the log, try running pip install holidays==0.32 directly in your pipeline and check whether it works.