We're looking to configure the Azure module (Filebeat Reference [7.17] | Elastic) and I had a question.
Our Filebeat is outputting to Logstash, which in turn writes out to Graylog. I have Filebeat set up to gather logs from an Azure Event Hub, and the logs are being collected, but the parsing doesn't seem to be happening. The message field in Graylog contains all the fields from Azure, but they are not parsed out.
Is this because Filebeat isn't using an Elasticsearch pipeline to parse events?
The parsing is done in Elasticsearch; it uses an ingest pipeline.
If you are sending to Logstash, you need to configure your Logstash Elasticsearch output to use the ingest pipeline in Elasticsearch. You can do that by adding the option
pipeline => "pipeline-name" to your Elasticsearch output.
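As a minimal sketch (the hosts value is a placeholder; the `%{[@metadata][pipeline]}` reference assumes Filebeat is shipping the module's pipeline name in the event metadata, which is what Filebeat modules do by default), the Logstash output section could look like:

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # Use the ingest pipeline name that Filebeat attaches to each event;
    # alternatively, hard-code a fixed name, e.g. pipeline => "my-pipeline"
    pipeline => "%{[@metadata][pipeline]}"
  }
}
```

Either way, the actual field extraction happens in Elasticsearch at index time, not inside Logstash itself.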
Thanks. What if Logstash sends out through the GELF format and not directly to Elasticsearch?
I don't understand your last question, can you give more context?
Send what to where in the GELF format?
In our flow we don't have Elasticsearch output in the mix:
filebeat -> logstash -> Graylog (out_gelf)
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.