I’m using Azure Data Factory and I have JSON files in Azure Data Lake. Each JSON file contains an array of JSON objects, and I want to copy each object of the array into its own Azure SQL row. I tried to do it with the Copy activity, but it automatically flattens the JSON, and I want to keep the original structure. I cannot use Data Flow.
The following picture describes what I want to achieve (ignore the values in the table).
3 Answers
I figured it out. I used a Lookup activity with a JSON linked service; then, iterating with a ForEach, I used a Script activity to write each @item() to an Azure SQL table row.
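A minimal sketch of what that Script activity could look like inside the ForEach, assuming a target table dbo.target with a single NVARCHAR(MAX) column json_col (both names are hypothetical):

-- Hypothetical table and column names.
-- Single quotes in the item are doubled so the interpolated JSON
-- stays a valid T-SQL string literal.
INSERT INTO dbo.target (json_col)
VALUES ('@{replace(string(item()), '''', '''''')}');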
In order to copy each JSON object of the array into a single column of a row, you can use OPENJSON in a Script activity. Below is the approach. A Lookup activity is taken, and the JSON file is taken as the dataset of the activity.
Then the output of the Lookup activity is stored in a variable named Json of String type. The value is assigned using a Set Variable activity.
Value for variable Json:
@string(activity('Lookup1').output.value)
A Script activity is taken, and the linked service for the Azure SQL database is given. A script is supplied that inserts the data into the table test_tgt, which is already created.
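A minimal sketch of such a script, assuming test_tgt has a single NVARCHAR(MAX) column named data (the column name is an assumption):

-- OPENJSON (compatibility level 130+) returns one row per array element
-- in its [value] column; single quotes in the variable are doubled so the
-- interpolated JSON stays a valid T-SQL string literal.
INSERT INTO test_tgt (data)
SELECT [value]
FROM OPENJSON('@{replace(variables('Json'), '''', '''''')}');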
SQL table output
I think you have to use dynamic expressions here.
Try to parse the output of your Lookup activity (the one that fetches the JSON file from your data lake).
Here is an example that extracts two separate JSON variables from an array of JSON on a data lake.
The JSON file is like:
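A hypothetical sample of such a file (the shape and values are assumptions for illustration):

[
  {"id": 1, "name": "alpha"},
  {"id": 2, "name": "beta"}
]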
The Lookup activity would be as below, with "First row only" unchecked so the whole array is returned:
Two variables of String type are defined (referred to below as variable 1 and variable 2):
The output of the Lookup activity is:
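With the sample file above, the output would look roughly like this (count/value is the standard Lookup output shape when "First row only" is unchecked; the elements are the assumed sample data):

{
  "count": 2,
  "value": [
    {"id": 1, "name": "alpha"},
    {"id": 2, "name": "beta"}
  ]
}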
To parse the array of JSON (the output of the Lookup activity), we use the following:
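Assuming the Lookup activity is named Lookup1 (the name is an assumption), two Set Variable activities can pick the first and second elements with:

@string(activity('Lookup1').output.value[0])
@string(activity('Lookup1').output.value[1])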
Results: variable 1 gets the first JSON object of the array, and variable 2 gets the second.
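With the hypothetical sample above, variable 1 would hold {"id": 1, "name": "alpha"} and variable 2 would hold {"id": 2, "name": "beta"}.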
Hope this helps.