
Similar to a previous question about reading logs from Azure Storage Accounts with Fluentd/Fluent Bit, I am looking for a more direct way to read logs from Azure Storage Accounts into Loki.

When searching, I found Amazon S3 plugins for Fluentd/Fluent Bit and Logstash, but nothing for Azure Storage Accounts. Only Sumo Logic seems to support streaming logs from Azure Storage Accounts.

There seems to be a possibility to read Azure logs into Loki from Azure Event Hubs.

I have implemented Java Azure Functions that are triggered when a storage event fires on the containers of the Azure Storage Account. From there, I plan to process the log files (in JSON) and push their entries to Loki via its API, or perhaps use a Java client to help with pushing the log lines to Loki.

My question is whether someone has a better idea, such as plugins similar to the ones provided for Amazon S3 by Fluent Bit/Fluentd or Logstash. If possible, I want to avoid using Azure Functions for Java.

Also, would it be better (when using Loki) to have the logs sent to an Azure Event Hub rather than to Storage Accounts? Cost is a critical factor for me, which is why I first opted for Storage Accounts and not Event Hubs.

2 Answers


  1. Chosen as BEST ANSWER

    Following Naveen's advice, I first used Event Subscriptions and system topics (Azure Event Grid) to receive events when blob files are added to storage accounts.

    Configuring the "azure_event_hubs" scrape job as above allowed Promtail to forward logs to Loki whenever blob events fired, but that only gave me the events themselves, not the content of the logs stored as JSON files in the storage containers.

    In order to get the actual Azure diagnostic log entries, I chose the "Stream to an event hub" option in the Azure Diagnostic settings (Event Hub as destination for diagnostic logs).

    Then I adjusted the Promtail configuration as follows:

    - job_name: azure_event_hubs
      azure_event_hubs:
        fully_qualified_namespace: ehns.servicebus.windows.net:9093
        connection_string: connection-string
        event_hubs:
          - eh-name
        labels:
          job: azure_event_hub
      relabel_configs:
        - action: replace
          source_labels:
            - __azure_event_hubs_category
          target_label: category

    as described in the Promtail configuration documentation.

    I did not need a forward_to attribute, because I do not use Grafana Agent Flow.
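
    For completeness, a minimal end-to-end Promtail configuration around that scrape job might look like the following sketch; the namespace, connection string, Event Hub name, and Loki URL are placeholders to replace with your own values:

    ```yaml
    server:
      http_listen_port: 9080
      grpc_listen_port: 0

    positions:
      filename: /tmp/positions.yaml

    clients:
      # Placeholder Loki push endpoint; adjust to your deployment.
      - url: http://loki:3100/loki/api/v1/push

    scrape_configs:
      - job_name: azure_event_hubs
        azure_event_hubs:
          # Note port 9093: Promtail consumes Event Hubs through
          # their Kafka-compatible endpoint.
          fully_qualified_namespace: ehns.servicebus.windows.net:9093
          connection_string: connection-string
          event_hubs:
            - eh-name
          labels:
            job: azure_event_hub
        relabel_configs:
          - action: replace
            source_labels:
              - __azure_event_hubs_category
            target_label: category
    ```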

    I am now able to receive Azure Diagnostic Logs directly from Azure in Loki and can query them in Grafana.

    The only disadvantage is that the Basic pricing tier of Azure Event Hubs cannot be used; the Standard tier or above is required.

    While this approach does solve my problem, I still want to minimize costs. Therefore, I will investigate whether I can replace Azure Event Hubs (as the destination for Azure Diagnostic Logs) with: 1. a Kafka instance, or 2. archiving to storage accounts (the most cost-effective approach) and finding a way to extract the log entries from there and ingest them into Loki.
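
    If Event Hubs were replaced by a self-managed Kafka instance, Promtail's kafka scrape config could consume the same stream. A hedged sketch follows; the broker address, topic name, and group ID are assumptions, not values from this setup:

    ```yaml
    scrape_configs:
      - job_name: kafka_diagnostics
        kafka:
          # Assumed broker and topic; replace with your own.
          brokers:
            - kafka.example.com:9092
          topics:
            - azure-diagnostic-logs
          group_id: promtail
          labels:
            job: kafka_diagnostics
    ```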

    Azure diagnostic settings allow sending to a partner solution, but I have not tried it out yet.


  2. Steps to ingest logs from Azure Blob Storage into Loki

    • Create an Event Hubs namespace and an Event Hub.
      In the Azure portal, go to the Azure Blob Storage account, select Events in the left menu, and then select + Event Subscription on the toolbar.

    • Enter a name for the event subscription.

    • Enter a name for the system topic. A system topic provides an endpoint for the sender to send events. For more information, see System topics.

    • Select Event Hubs as the endpoint type.


    • The event subscription is triggered by storage actions, such as those involving blobs, file shares, queues, and tables, and sends the resulting events to the Event Hub.


    Please refer to this link for Azure Event Hubs to Loki and link1 to read logs from Azure Storage Accounts into Loki.

    Steps to connect the Event Hub to Loki:

    • Go to the Event Hub and decide on the authentication method you’ll use: either OAuth or a connection string. If using OAuth, make sure you have the required credentials set up. If using a connection string, obtain it from Azure.

    • In your Grafana Agent Flow configuration file, set up the loki.source.azure_event_hubs component. Specify the Event Hub’s namespace, list the Event Hubs you want to consume, and define where the logs will be forwarded.

    • Configure the authentication block with your chosen method and credentials. You can also customize other settings like group ID, relabeling rules, and whether to use incoming timestamps.

    • Ensure the destination specified in forward_to is correctly configured to receive logs. This destination could be an instance of LogsReceiver.
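
    Putting the steps above together, a Grafana Agent Flow (River) configuration might look roughly like this sketch; the namespace, Event Hub name, connection string, and Loki URL are placeholders:

    ```river
    loki.source.azure_event_hubs "diagnostics" {
      // Kafka-compatible endpoint of the Event Hubs namespace (port 9093).
      fully_qualified_namespace = "ehns.servicebus.windows.net:9093"
      event_hubs                = ["eh-name"]

      // Connection-string authentication; "oauth" is the other mechanism.
      authentication {
        mechanism         = "connection_string"
        connection_string = "connection-string"
      }

      // Forward consumed log entries to the loki.write component below,
      // which acts as the LogsReceiver.
      forward_to = [loki.write.default.receiver]
    }

    loki.write "default" {
      endpoint {
        // Placeholder Loki push endpoint.
        url = "http://loki:3100/loki/api/v1/push"
      }
    }
    ```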
