
Update on 25 June – 2 (SOLVED)

How could I have been so foolish…
I created a generic Secret and modified the deployment.yaml to assign the variables from it. And it worked…
To sum up: all of my problems were caused by not setting the environment variables properly.
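For anyone hitting the same wall: `DefaultAzureCredential` reads a service principal from three well-known environment variables (via its `EnvironmentCredential` step). A minimal sketch with placeholder values:

```shell
# Placeholder values; substitute your app registration's real IDs.
export AZURE_TENANT_ID="<your-tenant-id>"
export AZURE_CLIENT_ID="<your-client-id>"
export AZURE_CLIENT_SECRET="<your-client-secret>"

# Sanity check: all three variables should be listed
printenv | grep '^AZURE_'
```

On the Web App these go into the app's configuration blade; on AKS they come from the Secret shown in the answer below.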


Update on 25 June – 2

The 403 Forbidden problem with the Web App has been fixed; however, I am now encountering the same problem in AKS. I want the containers in AKS to send logs to the same custom table as the Web App, but they are reporting the same error.



Update on 25 June – 1

I have successfully sent logs to a custom table in Log Analytics.
This was after I realized I had missed setting up the environment variables, as shown in the screenshot below.
After setting up the environment variables, sending logs to the custom table works. 🙂

(screenshot: the Web App's environment variables)


I have published a containerized Web App on Azure running Linux, written in C# .NET, and I want to send my app logs to a custom table in Azure Log Analytics.

Therefore, I followed the tutorials Tutorial: Send data to Azure Monitor using Logs ingestion API (Resource Manager templates) and Sample code to send data to Azure Monitor using Logs ingestion API to set up the Azure environment (DCR, DCE, a new table, an app registration) and wrote .NET code to send data to my custom table.

Despite numerous attempts, I consistently received a 403 (Forbidden) error. Even though I waited over 30 minutes (as the troubleshooting guide suggests), the issue persists.

Error message from the Azure Log stream, as below:

Upload failed with Exception The authentication token provided does not have access to ingest data for the data collection rule with immutable Id 'dcr-'.

Status: 403 (Forbidden)

ErrorCode: OperationFailed

Content:
{"error":{"code":"OperationFailed","message":"The authentication token provided does not have access to ingest data for the data collection rule with immutable Id 'dcr-'."}}

If anyone has experienced similar issues or has any suggestions, I would really appreciate it!

2 Answers


  1. Chosen as BEST ANSWER

    Thanks to Pravallika for taking the time to replicate my case. I figured out the solution after carefully reviewing the documentation again. For both the Web App and AKS, all of my problems came down to not setting the environment variables.

    For the Web App

    The key point is not to overlook setting the environment variables in the app's configuration.

    For AKS

    Step 1. Create Kubernetes Secret

    kubectl create secret generic azure-secrets \
      --from-literal=azure-client-id='<your-azure-client-id>' \
      --from-literal=azure-client-secret='<your-azure-client-secret>' \
      --from-literal=azure-tenant-id='<your-azure-tenant-id>' \
      --namespace your-namespace
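Before wiring the Secret into the Deployment, it may be worth confirming it actually holds all three keys (the namespace is a placeholder):

```shell
# List the keys stored in the Secret (values stay base64-encoded)
kubectl get secret azure-secrets --namespace your-namespace \
  -o jsonpath='{.data}'

# Decode one value to spot-check it
kubectl get secret azure-secrets --namespace your-namespace \
  -o jsonpath='{.data.azure-client-id}' | base64 --decode
```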
    

    Step 2. Reference the Secret in the Pod spec

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: my-application
      namespace: your-namespace
    spec:
      replicas: 1
      selector:
        matchLabels:
          app: my-application
      template:
        metadata:
          labels:
            app: my-application
        spec:
          containers:
          - name: my-container
            image: myimage
            env:
              - name: AZURE_CLIENT_ID
                valueFrom:
                  secretKeyRef:
                    name: azure-secrets
                    key: azure-client-id
              - name: AZURE_CLIENT_SECRET
                valueFrom:
                  secretKeyRef:
                    name: azure-secrets
                    key: azure-client-secret
              - name: AZURE_TENANT_ID
                valueFrom:
                  secretKeyRef:
                    name: azure-secrets
                    key: azure-tenant-id
    

    Step 3. Apply the deployment.
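Step 3 might look like this in full (the file name and namespace are placeholders), plus a quick check that the variables actually reached the container:

```shell
kubectl apply -f deployment.yaml --namespace your-namespace

# Verify the env vars are present inside a running pod of the Deployment
kubectl exec deploy/my-application --namespace your-namespace -- printenv | grep '^AZURE_'
```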

    Everything now works as expected.


  2. I have followed the MSDoc to configure the prerequisites required to send data to Azure Monitor Logs using the Logs ingestion API.

    Provide access to the registered application and your user account on the DCR, as mentioned in the MSDoc.

    Navigate to your Data Collection Rule (DCR) instance => Access Control (IAM) => Add Role Assignment => select the Monitoring Metrics Publisher role => select User, group, or service principal => select your application and your user account.
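The same role assignment can also be scripted with the Azure CLI; a sketch assuming placeholder IDs (the scope is the DCR's full resource ID):

```shell
# Grant "Monitoring Metrics Publisher" on the DCR to the app's service principal
az role assignment create \
  --assignee "<app-client-id>" \
  --role "Monitoring Metrics Publisher" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Insights/dataCollectionRules/<dcr-name>"
```

Role assignments can take a while to propagate, which is why the 30-to-45-minute wait below matters either way.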


    • After assigning the role, wait 30 to 45 minutes and then execute the code.

    I was able to run the code and got the expected response.

    Code Snippet:

    // Required namespaces for the Logs Ingestion client
    using Azure;
    using Azure.Core;
    using Azure.Identity;
    using Azure.Monitor.Ingestion;

    // Initialize variables
    var endpoint = new Uri("<DCE_Endpoint>");
    var ruleId = "<dcr-Immutable ID>";
    var streamName = "Custom-MyTableRawData"; // Stream name
    
    // Create credential and client
    var credential = new DefaultAzureCredential();
    LogsIngestionClient client = new(endpoint, credential);
    
    DateTimeOffset currentTime = DateTimeOffset.UtcNow;
    
    BinaryData data = BinaryData.FromObjectAsJson(
      new[] {
        new
        {
          Time = currentTime,
          Computer = "Computer1",
          AdditionalContext = new
          {
            InstanceName = "user1",
            TimeZone = "Pacific Time",
            Level = 4,
            CounterName = "AppMetric1",
            CounterValue = 15.3
          }
        },
        new
        {
          Time = currentTime,
          Computer = "Computer2",
          AdditionalContext = new
          {
            InstanceName = "user2",
            TimeZone = "Central Time",
            Level = 3,
            CounterName = "AppMetric1",
            CounterValue = 23.5
          }
        },
      });
    
    // Uploading logs
    try
    {
        var response = await client.UploadAsync(ruleId, streamName, RequestContent.Create(data)).ConfigureAwait(false);
        if (response.IsError)
        {
            throw new Exception(response.ToString());
        }
    
        Console.WriteLine("Log upload completed using content upload");
    }
    catch (Exception ex)
    {
        Console.WriteLine("Upload failed with Exception: " + ex.Message);
    }
    
    // Logs can also be uploaded in a List
    var entries = new List<object>();
    for (int i = 0; i < 10; i++)
    {
        entries.Add(
            new
            {
                Time = currentTime,
                Computer = "Computer" + i.ToString(),
                AdditionalContext = new
                {
                    InstanceName = "user" + i.ToString(),
                    TimeZone = "Central Time",
                    Level = 3,
                    CounterName = "AppMetric1" + i.ToString(),
                    CounterValue = i
                }
            }
        );
    }
    
    // Make the request
    try
    {
        var response = await client.UploadAsync(ruleId, streamName, entries).ConfigureAwait(false);
        if (response.IsError)
        {
            throw new Exception(response.ToString());
        }
    
        Console.WriteLine("Log upload completed using list of entries");
    }
    catch (Exception ex)
    {
        Console.WriteLine("Upload failed with Exception: " + ex.Message);
    }
    

    Console Response: the code printed "Log upload completed using content upload" and "Log upload completed using list of entries".

    The logs were uploaded to the custom table in the Log Analytics workspace.
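As an alternative to checking the portal, the custom table can be queried from the CLI as well; a sketch with a placeholder workspace GUID (this assumes the tutorial's table name, and custom tables carry the `_CL` suffix):

```shell
# Requires the log-analytics CLI extension
az monitor log-analytics query \
  --workspace "<workspace-guid>" \
  --analytics-query "MyTable_CL | take 10"
```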
