Azure integration for properly ingesting logs and metrics from an Azure Event Hub?


  1. I'm using the "Azure Event Hub Input" integration in Kibana to try to ingest logs and metrics previously saved in an Azure Event Hub. I selected all the checkboxes in Azure to send all logs and metrics to that Event Hub, but in Kibana I can see that only logs are ingested; there are no documents for metrics.
    Is there any Azure integration in Kibana for ingesting metrics coming from an Event Hub as well?

  2. The logs that I can ingest using the integration have all the important data in a field called "message". This is what the field looks like:

{"ActivityId":"00000000-0000-0000-0000-000000000000","EventId":15000,"Level":4,"Pid":8544,"ProviderGuid":"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx","ProviderName":"Legion-ContainerApps-ShoeboxProvider","Tenant":"northeurope-002","Tid":6684,"category":"ContainerAppConsoleLogs","location":"northeurope","operationName":"Microsoft.App/managedEnvironments/WRITE","properties":{"ContainerAppName":"xxxxxxx-microservice","ContainerGroupId":"xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx","ContainerGroupName":"xxxxxxx-microservice--xxxxxxx-xxxxxxxxxx-xxxxx","ContainerId":"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx","ContainerImage":"","ContainerName":"daprd","EnvironmentName":"xxxxxxxxxxxxxx-xxxxxxxx","Log":"time="2024-03-05T08:04:56.352669302Z" level=info msg="HTTP API Called" app_id=dapr-microservice instance=xxxxxxx-microservice--xxxxxxx-xxxxxxxxxx-xxxxx method="GET /v1.0/metadata" scope=dapr.runtime.http-info type=log ver=1.11.6","RevisionName":"xxxxxxx-microservice--xxxxxxx","Stream":"stdout"},"resourceId":"/SUBSCRIPTIONS/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/RESOURCEGROUPS/RG-XXX-XXX/PROVIDERS/MICROSOFT.APP/MANAGEDENVIRONMENTS/XXXXXXX-XXX-ENVIRONMENT-XXX","time":"2024-03-05T08:04:56.3520000Z"}

Is there any way to dissect this field automatically (within the integration itself) so that all of this data ends up in separate fields for proper dashboarding and alerting? The content of the "message" field seems to be normal JSON and should be easy for Elastic to parse, but I don't know why Elastic is ingesting only the "message" field with all the content inside it instead of creating a separate field for each field of that content.

Many thanks in advance.

The Azure Event Hub input is a generic input that collects messages from an event hub; it will not automatically parse your message.

You need to build a custom ingest pipeline to parse your JSON message after it has been collected by the integration.

Hi @juanmgarciaf

This is working exactly as designed, but with a simple ingest pipeline we can expand that message field, which appears to be JSON, into separate fields.

With a generic event hub log, Elastic has no way of knowing what the content is or how you want to parse it.

A couple of questions first.

Just to be clear, are you using Elastic Agent?

Exactly which integration? I think I know, but I just want to be sure.

What version of the stack are you on?

Hi Stephen,

Yes, I'm using Elastic Agent.

The integration that I'm using is "Azure Event Hub Input", because I have the data in the Event Hub, but I can only ingest logs, not metrics, and I don't know why, since I selected the option to send all the metrics to the Event Hub. Do you know if there is any problem with ingesting metrics from the Event Hub using that integration? If you needed to ingest metrics from an Event Hub using Elastic integrations, what integration/method would you use?

The stack version that I'm currently on is 8.11.1.

Thanks for the response!

Hi @juanmgarciaf

One thing at a time... Back to your logs, if you are interested.

  1. Have you upgraded to the latest integration version, 1.9.2?

  2. Did you enable the Parse Azure Logs setting?

In short, to add more processing to an integration, you add a custom ingest pipeline as described here.

In Kibana Dev Tools, run this... it will add a custom pipeline that extracts the JSON from the message field:

PUT _ingest/pipeline/logs-azure.eventhub@custom
{
  "processors": [
    {
      "json": {
        "field": "message",
        "target_field": "message_details",
        "ignore_failure": true
      }
    }
  ]
}
Why don't you see if that works? If you want to talk metrics, come back.
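(Editor's note, not part of the original replies: before sending real data through the pipeline, you can test it with the `_simulate` API, which runs the pipeline against an inline document. The sample message below is a simplified, hypothetical stand-in for the full Azure record shown earlier in the thread.)

```console
POST _ingest/pipeline/logs-azure.eventhub@custom/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "{\"category\":\"ContainerAppConsoleLogs\",\"location\":\"northeurope\",\"operationName\":\"Microsoft.App/managedEnvironments/WRITE\"}"
      }
    }
  ]
}
```

If the pipeline is working, the response shows the document with a `message_details` object containing the parsed `category`, `location`, and `operationName` fields alongside the original `message` field.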

On metrics: which metrics? There are many built-in Azure metrics... but if they are custom, you may need to do some custom work.

Many thanks, Stephen, for your help! I will check what you said about logs as soon as I can; after your comments I think I will be able to parse the logs correctly.

Regarding metrics, I have a "Container Apps Environment" in Azure where I have created a new "Diagnostic Setting" with the "All Metrics" checkbox selected, sending to an Event Hub. In this resource I have two "Container Apps" (with metrics for CPU, memory, network, etc.), so these metrics should be sent to the Event Hub correctly, and I think I should be able to ingest them into Elastic using an integration. If I use the "Azure Event Hub Input" integration, I can see only logs, not metrics.
Do you know why? If this is not possible using the "Azure Event Hub Input" integration, what other integration could I use to see the metrics of those Container Apps that are being sent to an Event Hub via the "Diagnostic Setting" option?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.