
I'd like to know a KQL query to detect changes in ADF (Azure Data Factory) triggers.

Since we are sending all logs to Log Analytics, I looked there and can see that several tables are created and populated, while some are not.

AzureActivity looked promising, but I only see Start/Success under ActivityStatusValue for the specific trigger name.

AzureDiagnostics does not appear to contain any data.
AzureMetrics has quite a few metrics under MetricName, but nothing close to what I'm looking for.

Thanks for any help you may provide.

2 Answers


  1. Chosen as BEST ANSWER

    The easy way to do this is to look at the Activity Log, right under the Overview menu option for the resource. You can select a time period and filter by Operation Name; in my case I searched for Trigger Stop, and that gave me the information I needed after downloading the results to a CSV worksheet.
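
    If you would rather stay in Log Analytics, those same Activity Log events land in the AzureActivity table, so a KQL sketch along these lines should get you started (assuming the factory's activity log is routed to your workspace; the 30-day window and the Success filter are illustrative):

    // Trigger-related control-plane operations captured in the Activity Log
    AzureActivity
    | where TimeGenerated > ago(30d)
    | where OperationNameValue has "MICROSOFT.DATAFACTORY/FACTORIES/TRIGGERS"
    | where ActivityStatusValue == "Success"
    | project TimeGenerated, OperationNameValue, Caller, ResourceGroup, _ResourceId
    | order by TimeGenerated desc

    Operation names such as .../TRIGGERS/WRITE (create or edit), .../TRIGGERS/DELETE, and .../TRIGGERS/STOP/ACTION tell you what kind of change happened.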


  2. REST API Solution (Using ADF and SQL Server):

    Web Activity JSON:

    {
    "name": "Initial ADF Triggers Logs",
    "type": "WebActivity",
    "dependsOn": [],
    "policy": {
        "timeout": "7.00:00:00",
        "retry": 1,
        "retryIntervalInSeconds": 180,
        "secureOutput": false,
        "secureInput": false
    },
    "userProperties": [],
    "typeProperties": {
        "url": {
            "value": "https://management.azure.com/subscriptions/@{pipeline().parameters.SubscriptionId}/resourceGroups/@{pipeline().parameters.ResourceGroupName}/providers/Microsoft.DataFactory/factories/@{pipeline().parameters.DataFactoryName}/triggers?api-version=2018-06-01",
            "type": "Expression"
        },
        "method": "GET",
        "body": {
            "value": "{\n\"lastUpdatedAfter\": \"@{pipeline().parameters.MaxLastUpdateTime}\",\n\"lastUpdatedBefore\": \"@{utcnow()}\",\n\"filters\": []\n}",
            "type": "Expression"
        },
        "authentication": {
            "type": "MSI",
            "resource": {
                "value": "https://management.azure.com/",
                "type": "Expression"
            }
        }
    }
    }
    

    The output of that looks like this:

    [Screenshot: API output listing each trigger's JSON definition]

    You could then read that into a stored procedure, write to a backend table, and do whatever cross-factory and historical tracking you would like. Here are the stored procedure and table DDL to make it happen (a sketch of the activity that connects the two follows the DDL):

    Procedure:

    CREATE OR ALTER PROCEDURE [AzureDataFactory].[PopulateTriggerLog_tmp] (
        @json NVARCHAR(MAX),                -- raw JSON array of triggers from the REST call
        @DataFactoryName VARCHAR(50) = NULL
    )
    AS
    BEGIN TRY
        -- Echo the raw payload; useful when debugging the call from ADF
        SELECT @json AS j;

        INSERT INTO AzureDataFactory.TriggerLog_tmp (
            TriggerId
            ,TriggerName
            ,TriggerType
            ,frequency
            ,interval
            ,startTime
            ,endTime
            ,timeZone
            ,ScheduleMinutes
            ,ScheduleHours
            ,ScheduleWeekdays
            ,ScheduleMonthDays
            ,MonthlyOccuranceDay
            ,MonthlyOccurance
            ,FirstPipelineName
            ,Descriptions
            ,RunTimeState
            ,etag
            ,DataFactoryName
            ,ETLUpdateTimestamp
        )
        SELECT
            JSON_VALUE ( j.[value], '$.id' ) AS TriggerId,
            JSON_VALUE ( j.[value], '$.name' ) AS TriggerName,
            JSON_VALUE ( j.[value], '$.properties.type' ) AS TriggerType,
            JSON_VALUE ( j.[value], '$.properties.typeProperties.recurrence.frequency' ) AS Frequency,
            JSON_VALUE ( j.[value], '$.properties.typeProperties.recurrence.interval' ) AS Interval,
            JSON_VALUE ( j.[value], '$.properties.typeProperties.recurrence.startTime' ) AS StartTime,
            JSON_VALUE ( j.[value], '$.properties.typeProperties.recurrence.endTime' ) AS EndTime,
            JSON_VALUE ( j.[value], '$.properties.typeProperties.recurrence.timeZone' ) AS TimeZone,
            -- Flatten the schedule arrays (e.g. [0,30]) into comma-separated strings
            REPLACE(REPLACE(REPLACE(JSON_QUERY ( j.[value], '$.properties.typeProperties.recurrence.schedule.minutes'), ' ', ''), '[', ''), ']', '') AS ScheduleMinutes,
            REPLACE(REPLACE(REPLACE(JSON_QUERY ( j.[value], '$.properties.typeProperties.recurrence.schedule.hours'), ' ', ''), '[', ''), ']', '') AS ScheduleHours,
            REPLACE(REPLACE(REPLACE(JSON_QUERY ( j.[value], '$.properties.typeProperties.recurrence.schedule.weekDays'), ' ', ''), '[', ''), ']', '') AS ScheduleWeekdays,
            REPLACE(REPLACE(REPLACE(JSON_QUERY ( j.[value], '$.properties.typeProperties.recurrence.schedule.monthDays'), ' ', ''), '[', ''), ']', '') AS ScheduleMonthDays, --Added V2 ijt 12/6/22
            JSON_VALUE ( JSON_QUERY ( j.[value], '$.properties.typeProperties.recurrence.schedule.monthlyOccurrences[0]'), '$.day') AS MonthlyOccuranceDay, --Added V2 ijt 12/6/22
            JSON_VALUE ( JSON_QUERY ( j.[value], '$.properties.typeProperties.recurrence.schedule.monthlyOccurrences[0]'), '$.occurrence') AS MonthlyOccurance, --Added V2 ijt 12/6/22
            JSON_VALUE ( JSON_QUERY ( j.[value], '$.properties.pipelines[0]'), '$.pipelineReference.referenceName') AS FirstPipelineName,
            JSON_VALUE ( j.[value], '$.properties.description' ) AS Descriptions,
            JSON_VALUE ( j.[value], '$.properties.runtimeState' ) AS RunTimeState,
            JSON_VALUE ( j.[value], '$.etag' ) AS etag,
            @DataFactoryName AS DataFactoryName,
            GETDATE() AS ETLUpdateTimestamp
        FROM OPENJSON( @json ) j    -- one row per trigger in the array
        ;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK;
        THROW;
    END CATCH
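
    To sanity-check the procedure outside of ADF (after creating the table below), you can feed it a minimal hand-made payload. This is just a smoke test; the subscription ID, trigger name, and pipeline name are invented for illustration:

    DECLARE @sample NVARCHAR(MAX) = N'[{
        "id": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/rg-demo/providers/Microsoft.DataFactory/factories/adf-demo/triggers/TR_Daily",
        "name": "TR_Daily",
        "etag": "0a0068d4-0000-0000-0000-000000000000",
        "properties": {
            "description": "Daily 06:00 UTC load",
            "type": "ScheduleTrigger",
            "runtimeState": "Started",
            "typeProperties": {
                "recurrence": {
                    "frequency": "Day",
                    "interval": 1,
                    "startTime": "2023-01-01T06:00:00Z",
                    "timeZone": "UTC",
                    "schedule": { "hours": [6], "minutes": [0] }
                }
            },
            "pipelines": [ { "pipelineReference": { "referenceName": "PL_Daily_Load", "type": "PipelineReference" } } ]
        }
    }]';

    -- One row should land in AzureDataFactory.TriggerLog_tmp
    EXEC [AzureDataFactory].[PopulateTriggerLog_tmp] @json = @sample, @DataFactoryName = 'adf-demo';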
    

    Table DDL:

    CREATE TABLE [AzureDataFactory].[TriggerLog_tmp](
    [TriggerId] [varchar](250) NOT NULL,
    [TriggerName] [varchar](100) NULL,
    [TriggerType] [varchar](25) NULL,
    [frequency] [varchar](10) NULL,
    [interval] [int] NULL,
    [startTime] [datetime] NULL,
    [endTime] [datetime] NULL,
    [timeZone] [varchar](50) NULL,
    [ScheduleMinutes] [varchar](500) NULL,
    [ScheduleHours] [varchar](500) NULL,
    [ScheduleWeekdays] [varchar](500) NULL,
    [ScheduleMonthDays] [varchar](50) NULL,
    [MonthlyOccuranceDay] [varchar](500) NULL,
    [MonthlyOccurance] [int] NULL,
    [Descriptions] [varchar](800) NULL,
    [RunTimeState] [varchar](500) NULL,
    [FirstPipelineName] [varchar](50) NULL,
    [etag] [varchar](50) NOT NULL,
    [DataFactoryName] [varchar](50) NULL,
    [ETLUpdateTimestamp] [datetime] NULL
    ) ON [PRIMARY]
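
    To wire the two together, a Stored Procedure activity downstream of the Web Activity can pass the response's value array straight into @json. A sketch of that activity follows; the linked service name is a placeholder, and I'm using the @{pipeline().DataFactory} system variable so the same pipeline works in every factory:

    {
        "name": "Write Trigger Log",
        "type": "SqlServerStoredProcedure",
        "dependsOn": [
            {
                "activity": "Initial ADF Triggers Logs",
                "dependencyConditions": [ "Succeeded" ]
            }
        ],
        "linkedServiceName": {
            "referenceName": "LS_SqlServer_Logging",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "storedProcedureName": "[AzureDataFactory].[PopulateTriggerLog_tmp]",
            "storedProcedureParameters": {
                "json": {
                    "value": {
                        "value": "@{string(activity('Initial ADF Triggers Logs').output.value)}",
                        "type": "Expression"
                    },
                    "type": "String"
                },
                "DataFactoryName": {
                    "value": {
                        "value": "@{pipeline().DataFactory}",
                        "type": "Expression"
                    },
                    "type": "String"
                }
            }
        }
    }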
    

    I know this is not done through Log Analytics, but from experience it is a highly effective way of tracking triggers across ALL Azure Data Factories, which is imperative once you have dozens of ADFs to support. It's always fun when your director asks you, "Is there anything running in ADF right now?"
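
    On that last question: once the table is loaded on a schedule, something like this (using only the columns from the DDL above, and assuming each run writes a full per-factory snapshot) gives a quick answer:

    -- Which triggers are currently started, per factory, from the latest snapshot?
    ;WITH latest AS (
        SELECT DataFactoryName, MAX(ETLUpdateTimestamp) AS MaxTs
        FROM AzureDataFactory.TriggerLog_tmp
        GROUP BY DataFactoryName
    )
    SELECT t.DataFactoryName, t.TriggerName, t.frequency, t.interval, t.RunTimeState
    FROM AzureDataFactory.TriggerLog_tmp t
    JOIN latest l
        ON  l.DataFactoryName = t.DataFactoryName
        AND l.MaxTs = t.ETLUpdateTimestamp
    WHERE t.RunTimeState = 'Started'
    ORDER BY t.DataFactoryName, t.TriggerName;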
