
I am accepting multiple zip files that I want to process in an orchestrator. My durable orchestrator is HTTP-triggered.

I can access the files in the HTTP trigger as a MultipartMemoryStreamProvider, but when I pass that same object to the durable orchestrator, the orchestrator triggers yet cannot get the files for further processing.

Below is my HTTP trigger function code that reads the multiple files and passes them to the orchestrator:

    var data = await req.Content.ReadAsMultipartAsync();
    string instanceId = await starter.StartNewAsync("ParentOrchestrator", data);

Orchestrator Trigger code:

    public static async Task<List<string>> RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        var files = context.GetInput<System.Net.Http.MultipartMemoryStreamProvider>();

To read the input I also tried creating a class and assigning the stream to a property so the data could be serialized as JSON, but that did not work either.
Am I missing anything in the code?
The issue is how to get the zip files for processing.

I checked the raw input under the orchestrator context; there I can see the file name and other details.

2 Answers


  1. Chosen as BEST ANSWER

    The orchestrator accepts only data that can be serialized. Since a MemoryStream is not serializable, the data could not be retrieved using GetInput<MultipartMemoryStreamProvider>(). I converted the memory stream to a byte array, because a byte array can be serialized. I had read multiple articles claiming that converting the file to a byte array loses the file metadata. In fact, if you read the file as a stream and then convert that stream to a byte array, the file data is converted along with its metadata. The steps:

    1. Read the HttpRequestMessage with ReadAsMultipartAsync(); this returns a MultipartMemoryStreamProvider.
    2. Convert the data to a byte array.
    3. Pass the byte array to the orchestrator.
    4. Receive the files as a byte array using GetInput<byte[]>().
    5. In the orchestrator, convert the byte array back to a stream: MemoryStream ms = new MemoryStream(inputByteArray);
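    The steps above can be sketched as follows. This is a minimal sketch, assuming the usual Durable Functions 2.x HTTP-starter signature; the function names and the List<byte[]> shape are illustrative choices, not from the original code.

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Net.Http;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.DurableTask;
    using Microsoft.Azure.WebJobs.Extensions.Http;

    public static class ZipUpload
    {
        [FunctionName("HttpStart")]
        public static async Task<HttpResponseMessage> HttpStart(
            [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestMessage req,
            [DurableClient] IDurableOrchestrationClient starter)
        {
            // 1) Read the request as multipart content.
            var provider = await req.Content.ReadAsMultipartAsync();

            // 2) Convert each part to a byte array, which serializes cleanly.
            var files = new List<byte[]>();
            foreach (var part in provider.Contents)
            {
                files.Add(await part.ReadAsByteArrayAsync());
            }

            // 3) Pass the serializable byte arrays to the orchestrator.
            string instanceId = await starter.StartNewAsync("ParentOrchestrator", files);
            return starter.CreateCheckStatusResponse(req, instanceId);
        }

        [FunctionName("ParentOrchestrator")]
        public static async Task RunOrchestrator(
            [OrchestrationTrigger] IDurableOrchestrationContext context)
        {
            // 4) Receive the files as byte arrays.
            var files = context.GetInput<List<byte[]>>();

            // 5) Re-wrap each byte array in a stream for further processing.
            foreach (var bytes in files)
            {
                using (var ms = new MemoryStream(bytes))
                {
                    // ... process the zip stream, e.g. with ZipArchive ...
                }
            }
            await Task.CompletedTask;
        }
    }
    ```
    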

  2. Passing files as input seems like a bad idea to me.
    Those inputs will be loaded by the orchestrator from Table Storage/Blob Storage each time it replays.
    Instead I would recommend that you upload the Zip files to Blob Storage and pass the blob URLs as input to the orchestrator.
    Then you use the URLs as inputs to activities where the files are actually processed.
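    A minimal sketch of that pattern, assuming the Azure.Storage.Blobs SDK; the container name "uploads", the activity name "ProcessZip", and the use of the AzureWebJobsStorage connection string are all placeholder assumptions:

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.DurableTask;
    using Microsoft.Azure.WebJobs.Extensions.Http;

    public static class ZipViaBlobs
    {
        [FunctionName("HttpStartBlobs")]
        public static async Task<HttpResponseMessage> HttpStart(
            [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestMessage req,
            [DurableClient] IDurableOrchestrationClient starter)
        {
            var provider = await req.Content.ReadAsMultipartAsync();

            // Upload each zip to Blob Storage; only the small URLs end up in
            // the orchestration history, not the file contents.
            var container = new BlobContainerClient(
                Environment.GetEnvironmentVariable("AzureWebJobsStorage"), "uploads");
            await container.CreateIfNotExistsAsync();

            var blobUrls = new List<string>();
            foreach (var part in provider.Contents)
            {
                var blob = container.GetBlobClient(Guid.NewGuid() + ".zip");
                await blob.UploadAsync(await part.ReadAsStreamAsync());
                blobUrls.Add(blob.Uri.ToString());
            }

            string instanceId = await starter.StartNewAsync("ParentOrchestrator", blobUrls);
            return starter.CreateCheckStatusResponse(req, instanceId);
        }

        [FunctionName("ParentOrchestrator")]
        public static async Task RunOrchestrator(
            [OrchestrationTrigger] IDurableOrchestrationContext context)
        {
            var urls = context.GetInput<List<string>>();
            foreach (var url in urls)
            {
                // Download and process inside an activity, not the orchestrator,
                // so replays stay cheap and deterministic.
                await context.CallActivityAsync("ProcessZip", url);
            }
        }
    }
    ```
    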
