
I created a "Trigger Azure Functions on blob containers using an event subscription" function with Visual Studio Code and I am running it locally. For now I want to read netCDF files, which I normally do like this:

from netCDF4 import Dataset

nc_file = 'path of .nc file'
nc = Dataset(nc_file, mode='r')

but now I don't know how to find the path of my file in the container. My __init__.py file in the Azure Function looks like this:

import logging

import azure.functions as func


def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")

Thank you in advance for your time and concern.

2 Answers


  1. Chosen as BEST ANSWER

    I figured it out thanks to DopplerShift's answer, and I am writing it here for those who may need it:

    from netCDF4 import Dataset
    fobj = open('path/to/netcdf.nc', 'rb')
    data = fobj.read()
    nc = Dataset('memory', memory=data)
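
    For completeness, here is a rough sketch of how this technique could be combined with the blob trigger from the question, so the netCDF file is read straight from the triggering blob without any local path. The "in-memory.nc" label and the variable-listing log line are illustrative assumptions, not part of the original answer.

    import logging

    import azure.functions as func
    from netCDF4 import Dataset

    def main(myblob: func.InputStream):
        # The trigger hands us the blob as a stream; read it into bytes.
        data = myblob.read()

        # Open the netCDF file from the in-memory bytes; the first argument
        # is only a label when memory= is supplied.
        nc = Dataset("in-memory.nc", memory=data)
        logging.info(f"Variables in {myblob.name}: {list(nc.variables.keys())}")
        nc.close()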
    

  2. but now I don't know how to find the path of my file in the container

    Since you are using a blob trigger, you just need to upload a file to the container that you have specified in the "path" property of your Azure Function's function.json; that property is what defines which container and blob name the trigger watches. Below is my function.json, with which the function is able to read the file when it is uploaded to the container named "container1".

    {
      "scriptFile": "__init__.py",
      "bindings": [
        {
          "name": "myblob",
          "type": "blobTrigger",
          "direction": "in",
          "path": "container1/{name}", /*This is the line that defines the path*/
          "connection": "AzureWebJobsStorage"
        }
      ]
    }
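
    To fire this trigger during a local run, one option is to upload a file into "container1" programmatically. Below is a minimal sketch using the azure-storage-blob package; the connection string (here pointing at the local Azurite emulator) and the mydata.nc filename are assumptions and should be replaced with whatever your AzureWebJobsStorage setting points to.

    from azure.storage.blob import BlobServiceClient

    # Connection string for the storage account behind AzureWebJobsStorage;
    # "UseDevelopmentStorage=true" targets the local Azurite emulator.
    service = BlobServiceClient.from_connection_string("UseDevelopmentStorage=true")
    blob = service.get_blob_client(container="container1", blob="mydata.nc")

    # Uploading the file fires the blob trigger bound to container1/{name}.
    with open("mydata.nc", "rb") as f:
        blob.upload_blob(f, overwrite=True)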
    

    RESULTS: (screenshot not shown)
