Re: Filebeat > Kafka > Logstash > ElasticSearch

(Archelle Pagapulan) #1

Does anyone know why Beats metadata such as the beat name, version, etc. is not included after indexing in Elasticsearch?

(Magnus Bäck) #2

You're talking about the beat field and its subfields? It should be included. We can't say much without seeing your configuration and a sample event produced by a stdout { codec => rubydebug } output.
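For reference, a minimal Logstash output block that prints each event in that form (a sketch for debugging only, not part of the poster's configuration) would be:

```
output {
  stdout { codec => rubydebug }
}
```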

(Archelle Pagapulan) #3

Thank you for your reply, Magnus.

We have configured our Filebeat to output to Kafka only:

output.kafka:
  hosts: ["host1:9092", "host2:9093", "host32:9094"]
  codec.format:
    string: '%{[message]}{[beats]}'
  topic: 'mytopic'

while our Logstash config is:

input {
  kafka {
    topics => ["mytopic"]
    bootstrap_servers => "host1:9092,host2:9093,host3:9094"
  }
}

output {
  elasticsearch {
    hosts => [""]
    index => "mycollection"
  }
}
Filebeat's stdout output contains the beat field, but after indexing in Elasticsearch the beat field and its subfields are gone.

(Magnus Bäck) #4

Please supply everything I asked for. The sample event is missing.

(Archelle Pagapulan) #6

Sorry, I forgot to include the sample event.

But I have found the solution.

When we use Filebeat to send events to Logstash, the metadata is included automatically.
However, when the events go from Filebeat to Kafka, by default only the message field is sent to Kafka.
That is why we need to set the Filebeat codec format and append the Filebeat metadata such as input_type, source, etc. See the config below.

output.kafka:
  hosts: ["host:9092"]
  codec.format:
    string: 'beat=%{[beat]} offset=%{[offset]} filesource=%{[source]} input_type=%{[input_type]} start=%{[message]}'
  topic: 'mytopic'

We can now parse these fields using a grok filter in Logstash.
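For example, a grok filter along these lines could split the key=value string back into fields (a sketch only; the target field names are assumptions based on the codec string above):

```
filter {
  grok {
    match => {
      "message" => "beat=%{NOTSPACE:beat_name} offset=%{NUMBER:offset:int} filesource=%{NOTSPACE:filesource} input_type=%{NOTSPACE:input_type} start=%{GREEDYDATA:message}"
    }
    overwrite => [ "message" ]
  }
}
```

The overwrite option replaces the original key=value string with just the captured log line, so the indexed message field looks the same as it would coming straight from Filebeat.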

(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.