Sending azure_event_hub messages to elasticsearch


(Nick) #1

I recently installed Elastic Stack for the first time using the steps at Digital Ocean. I then followed the steps at the Azure Event Hubs plugin page to link logstash to an Azure event hub. This seemed to go okay, but nothing is showing up in kibana or even elasticsearch as far as I can tell.

My config looks like this:

input {
  azure_event_hubs {
    config_mode => "advanced"
    event_hubs => [
      { "myentitypath" => {
        event_hub_connection => "Endpoint=sb://myendpoint.servicebus.windows.net/;SharedAccessKeyName=logstash;SharedAccessKey=alongrandomkey;EntityPath=myentitypath"
      }}
    ]
    threads => 8
    decorate_events => true
    consumer_group => "$Default"
    storage_connection => "DefaultEndpointsProtocol=https;AccountName=mystorageacct;AccountKey=alongrandomkey;EndpointSuffix=core.windows.net"
  }
}

I feel like I probably need to specify some output details? The only output config I have is the one I set up for Filebeat:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

Is there an output format that I need to specify for Azure Event Hub? Do I need to somehow create the indexes in ES?

Any help is much appreciated. As you can tell I don't really understand how all this works. 🙂


(Nachiket) #2

Hi Nick,

Could you verify whether you are receiving the events on stdout?

Try using the following config:

input {
  .... Your Input Settings .....
}
output {
  stdout { }
}
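
You can also pretty-print each event by adding the rubydebug codec to the stdout output — just a variant of the same idea:

output {
  stdout { codec => rubydebug }
}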

This will confirm that the input configuration is working; then we can look for errors on the output side. A typical elasticsearch output looks as follows:

output {
    elasticsearch {
        action => "index"
        hosts => ["http://x.x.x.x:9200"]
        index => "indice"
    }
}

I believe you are missing the action part in the elasticsearch config.
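
For instance, combined with your event hub input it might look something like this — just a sketch, and azure-events-%{+YYYY.MM.dd} is only an example index name I picked for illustration:

output {
  elasticsearch {
    action => "index"
    hosts => ["http://localhost:9200"]
    index => "azure-events-%{+YYYY.MM.dd}"
  }
}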


(Nick) #3

Thanks for the reply and the very helpful stdout config! Using that, I can see that the azure event hub data is definitely making it to logstash:

"@timestamp" => 2019-02-08T14:04:32.149Z,
   "message" => "{\"records\": [{ \"LogicalServerName\": \"myservername\", \"SubscriptionId\": \"mysubuuid\", \"ResourceGroup\": \"myRG\", \"time\": \"2019-02-08T14:00:21.7540000Z\"...

So next I tried adding your output block verbatim, and nothing happened. I assume that's because "indice" needs to be an actual index name and not just the word, but how do I know which index to use? And how do I create the indices that azure_event_hubs expects? Filebeat has a handy tool that does this, but I don't see anything similar for this plugin.


(Nick) #4

Does anyone have any solutions to this? It seems odd that the documentation tells me how to ingest the data, but not how to display it. Are there any other docs someone can point me to that might help?