I've started learning Azure Logic Apps, and my first task is to store the result of a specific Kusto query run against the Azure Log Analytics API (https://api.loganalytics.io/v1/workspaces/{guid}/query).
Currently, I can successfully call the Log Analytics API using an HTTP action in the Logic App, and this is a sample of what it returns:
{
  "tables": [
    {
      "name": "PrimaryResult",
      "columns": [
        { "name": "UserPrincipalName", "type": "string" },
        { "name": "OperationName", "type": "string" },
        { "name": "ResultDescription", "type": "string" },
        { "name": "AuthMethod", "type": "string" },
        { "name": "TimeGenerated", "type": "string" }
      ],
      "rows": [
        [
          "[email protected]",
          "Fraud reported - no action taken",
          "Successfully reported fraud",
          "Phone call approval (Authentication phone)",
          "22-01-03 [09:01:03 AM]"
        ],
        [
          "[email protected]",
          "Fraud reported - no action taken",
          "Successfully reported fraud",
          "Phone call approval (Authentication phone)",
          "22-02-19 [01:28:29 AM]"
        ]
      ]
    }
  ]
}
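For reference, the request behind the HTTP action looks roughly like the following (a sketch; the bearer token and the Kusto query string are placeholders, not the actual values used):

POST https://api.loganalytics.io/v1/workspaces/{guid}/query
Content-Type: application/json
Authorization: Bearer <access-token>

{
    "query": "<your Kusto query>"
}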
From this result, I'm stuck on how to iterate over the rows property of the JSON result and save that data to Azure Table Storage, with each value mapped to the corresponding entry in the columns property. E.g.:
| UserPrincipalName | OperationName | ResultDescription | AuthMethod | TimeGenerated |
|---|---|---|---|---|
| [email protected] | Fraud reported - no action taken | Successfully reported fraud | Phone call approval (Authentication phone) | 22-01-03 [09:01:03 AM] |
| [email protected] | Fraud reported - no action taken | Successfully reported fraud | Phone call approval (Authentication phone) | 22-02-19 [01:28:29 AM] |
Hope someone can guide me on how to achieve this.
TIA!
2 Answers
Firstly, use a Parse JSON action and load your JSON in as a sample to generate the schema. Then use a For each (rename them accordingly) to traverse the rows; this will automatically generate an outer For each for the tables (the loop expressions are sketched below). This is the trickier part: you need to generate a payload that contains your data with some specific keys that you can then identify in your storage table.
This is my test flow with your data …
… and this is the end result in storage explorer …
The JSON within the entity field in the Insert Entity action looks like this (the screenshot isn't reproduced here; a sketch follows below). I simply used GUIDs to make it work, but you'd want to come up with some kind of key from your data to make it much more rational. Maybe the date field or something.
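An entity payload along these lines would match that description (a sketch, assuming the inner loop is named For_each_row; each row is an array, so the columns are addressed by position):

{
    "PartitionKey": "@{guid()}",
    "RowKey": "@{guid()}",
    "UserPrincipalName": "@{items('For_each_row')[0]}",
    "OperationName": "@{items('For_each_row')[1]}",
    "ResultDescription": "@{items('For_each_row')[2]}",
    "AuthMethod": "@{items('For_each_row')[3]}",
    "TimeGenerated": "@{items('For_each_row')[4]}"
}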
You can use a Parse JSON action to extract the inner data of the output provided, and then use the Insert or Replace Entity action inside a For each loop. Here is the screenshot of my logic app and of the result in my storage account.
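As a sketch (assuming the Parse JSON action is named Parse_JSON), the For each in this approach would typically iterate over the rows of the first table directly:

@body('Parse_JSON')?['tables'][0]['rows']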
UPDATED ANSWER
Instead of directly using Insert or Replace Entity, I have initialised 2 variables and then used Insert or Merge Entity. One variable is used to iterate over the rows and the other to iterate over the columns using Until loops, fetching the required values from the tables. Here is the screenshot of my logic app. In the first Until loop, the iteration continues until the rows variable is equal to the number of rows. Below is the expression:
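The expression itself isn't reproduced here, but a condition along these lines would match that description (a sketch, assuming the counter variable is named rows and the Parse JSON action is named Parse_JSON):

@equals(variables('rows'), length(body('Parse_JSON')['tables'][0]['rows']))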
In the second Until loop, the iteration continues until the columns variable is equal to the number of columns. Below is the expression:
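Again as a sketch, under the same assumptions plus a counter variable named columns:

@equals(variables('columns'), length(body('Parse_JSON')['tables'][0]['columns']))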
Below is the expression I'm using in the Insert or Merge Entity action's Entity field, followed by the results in the storage table (screenshots not reproduced here).
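A sketch of an entity expression that fits the described approach, assuming the counter variables are named rows and columns (the PartitionKey value and the use of json(concat(...)) to build a dynamic property name are illustrative assumptions, not taken from the original screenshots; in the designer it would be entered as a single expression):

@json(concat(
    '{"PartitionKey":"PrimaryResult",',
    '"RowKey":"', string(variables('rows')), '",',
    '"', body('Parse_JSON')['tables'][0]['columns'][variables('columns')]['name'], '":"',
    body('Parse_JSON')['tables'][0]['rows'][variables('rows')][variables('columns')], '"}'
))

Because the action is Insert or Merge Entity, each pass of the inner columns loop merges one more property into the same PartitionKey/RowKey, so the full row is built up across iterations. Note that this simple string concatenation assumes the values don't contain double quotes.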