
I have a Function App (Python) that should read from Container1 in Blob Storage, and then write something back to Container2. Both containers are in the same Storage Account and subject to the same permissions.

When I run the Function locally with Azure Functions Core Tools, everything works fine. When I upload the Function to Azure and run it from there, it successfully reads from Container1 but does not write to Container2. Given that the code works locally, I think the Function App code is OK, so it must be a discrepancy between my local environment and what's in Azure, but I cannot work out what.

I’ve checked everything I can think of. I’ve recreated everything from scratch in a new Resource Group and get the same problem. There are no error messages that I can see.

I've got Application Insights running, but I'm not seeing anything in there.

Has anyone had a similar issue or have any ideas of what to check?

2 Answers


  1. Chosen as BEST ANSWER

    Found the problem. I hadn't realised that Function Apps on the Consumption plan automatically time out after 5 minutes by default. This timeout is not enforced by Azure Functions Core Tools when running locally.

    As my Function uploads a file to another service and then waits 5 minutes before checking the output, it was always going to time out.

    The fix was to set the functionTimeout value in host.json to a more suitable value. I went for 10 minutes, which is the maximum for the Consumption plan (as per this site).
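
    For reference, a minimal host.json along these lines is all that's needed (the functionTimeout value uses hh:mm:ss format; any other settings already in the file can stay as they are):

        {
          "version": "2.0",
          "functionTimeout": "00:10:00"
        }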


  2. I have reproduced this in my environment and got the expected results, as below:

    Below is the code which worked for me:

    __init__.py:

        import logging
        import os

        import azure.functions as func
        from azure.storage.blob import BlobServiceClient


        def main(req: func.HttpRequest) -> func.HttpResponse:
            try:
                # Connection string comes from the AzureWebJobsStorage app setting
                con = os.environ["AzureWebJobsStorage"]
                bsc = BlobServiceClient.from_connection_string(con)

                source_con = "<source_container>"
                destination_con = "<destination_container>"
                bn = "<blob_name>"

                # Download the blob from the source container
                scc = bsc.get_container_client(source_con)
                sbc = scc.get_blob_client(bn)
                blob_content = sbc.download_blob().readall()

                # Upload the same content to the destination container
                dcc = bsc.get_container_client(destination_con)
                dbc = dcc.get_blob_client(bn)
                dbc.upload_blob(blob_content, overwrite=True)

                return func.HttpResponse("Blob content copied successfully!", status_code=200)
            except Exception as e:
                logging.error(f"An error occurred: {e}")
                return func.HttpResponse("Internal server error", status_code=500)
    

    local.settings.json:

        {
          "IsEncrypted": false,
          "Values": {
            "AzureWebJobsStorage": "<connection string of blob storage>",
            "FUNCTIONS_WORKER_RUNTIME": "python"
          }
        }
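
    For completeness, an HTTP trigger like the one above is assumed to be paired with a function.json roughly along these lines (the auth level and methods may differ in your project):

        {
          "scriptFile": "__init__.py",
          "bindings": [
            {
              "authLevel": "function",
              "type": "httpTrigger",
              "direction": "in",
              "name": "req",
              "methods": [ "get", "post" ]
            },
            {
              "type": "http",
              "direction": "out",
              "name": "$return"
            }
          ]
        }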
    

    Then I successfully executed it locally:

    [screenshots]

    Then I deployed it:

    [screenshot]

    Blob uploaded:

    [screenshot]

    Even after deploying, the function copies data to the destination container and works as expected:

    [screenshot]

    Note: Also check that the storage connection string is configured correctly in the Function App's settings:

    [screenshot]
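
    If it is unclear whether the deployed app can actually see the connection string, one hypothetical check is to log whether the setting is present at runtime (without printing the value itself); the resulting trace should show up in the log stream or Application Insights:

        import logging
        import os

        def log_storage_setting() -> None:
            # Hypothetical helper: confirms the AzureWebJobsStorage app setting
            # is visible to the deployed function without logging the secret.
            present = "AzureWebJobsStorage" in os.environ
            logging.info("AzureWebJobsStorage configured: %s", present)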

    Output:

    Blob copied to the destination container:

    [screenshot]
