Azure Activity Log to Logstash

I want to pipe the Azure Monitor Activity Log data into Elasticsearch using the Logstash plug-in found here. I have it configured and am able to see the log data, but it appears to be missing some of the detail.

Azure Monitor's Activity Log lets you view the raw JSON data, and there I can see operationName formatted like "operationName": {"value": "Microsoft.Resources/Deployments/Write", "localizedValue": "Create deployment"}.

When I read the data into ELK through the Logstash plugin and view it in Kibana, the message's JSON has operationName flattened to just the string value, i.e. "Microsoft.Resources/Deployments/Write", and the more user-friendly localizedValue is absent.

I'm not sure whether the localizedValue is already missing when the data leaves the Event Hub. I'm on a free-trial Azure account, so I can't easily send the logs to Blob Storage or Data Lake Storage to examine them more closely. I hope to talk to someone from Microsoft later in the week.
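One way to check whether localizedValue ever reaches Logstash, without Blob or Data Lake access, is to dump the raw Event Hub payload to stdout before any filtering. A minimal sketch (the connection string is a placeholder, not the real one):

input {
  azure_event_hubs {
    # Placeholder -- substitute your own Event Hub connection string
    event_hub_connections => ["<connection-string>"]
    consumer_group => "$Default"
  }
}
# No filters: print the untouched payload so the raw JSON
# (including any localizedValue) can be inspected directly.
output {
  stdout { codec => rubydebug }
}

If localizedValue is absent here too, the field is being dropped upstream of Logstash rather than by the filter chain.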

Has anybody had luck getting the localizedValue fields into ELK from the Azure Activity Logs?


Michael H.

Here is the Logstash config I'm using, with that filter:
input {
  azure_event_hubs {
    event_hub_connections => ["Endpoint=sb://;SharedAccessKeyName=SAS_bar;SharedAccessKey=****;EntityPath=insights-operational-logs"]
    threads => 4
    decorate_events => true
    consumer_group => "$Default"
  }
}

filter {
  json {
    source => "message"
  }
  split {
    field => "records"
  }
  date {
    # Activity Log timestamps are ISO8601, e.g. 2018-01-01T12:34:56.789Z
    match => [ "[records][time]", "ISO8601" ]
  }
}

output {
  elasticsearch {
    index => "insights-operational-logs-%{+YYYY.MM.dd}"
    hosts => [ "" ]
  }
}
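If the Event Hub payload does turn out to contain localizedValue, the json filter preserves nested objects, so it should surface as [records][operationName][localizedValue] after the split. A mutate could then copy it into a friendlier top-level field; a sketch, assuming the field is present (the target field name operation_display_name is my own choice):

filter {
  # Guarded so events without the localized name pass through untouched
  if [records][operationName][localizedValue] {
    mutate {
      copy => { "[records][operationName][localizedValue]" => "operation_display_name" }
    }
  }
}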

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.