Filebeat > Kafka > ES > Kibana

Hi folks,

I have a question regarding the Filebeat configuration. If I send the syslog data (with the syslog module) directly from Filebeat to Elasticsearch, the prebuilt dashboard from the tutorial works (see picture). There, process.name and the timestamp are not shown inside the message, but as separate fields.

However, if I send the data from Filebeat into Kafka and then via Logstash into Elasticsearch, it does not work (see figure, Kafka Host vs. ELS Host).

Important excerpts from filebeat.yml (from the Kafka host, where process.name cannot be displayed):

output.kafka:
  enabled: true
  hosts: ["localhost:9092"]
  topic: "filebeats"
  partition.round_robin:
    reachable_only: false

  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000

# ================================= Processors =================================
processors:
  - add_host_metadata:
  - add_process_metadata:

Important excerpts from the first logstash.conf (ships data from one Kafka topic to another, to store normalized data):

input {
  kafka {
    client_id => "input"
    codec => json
    topics => ["filebeats"]
    bootstrap_servers => "10.250.19.13:9092"
  }
}
# To preserve the @metadata fields, use the Logstash mutate filter with the rename setting to rename the fields to non-internal fields.
filter {
  mutate {
    rename => {
      "@metadata" => "metadata"
    }
  }
}

output {
  kafka {
    client_id => "output"
    codec => json
    topic_id => "filebeats_replay"
    bootstrap_servers => "10.250.19.13:9092"
  }
}

Important excerpts from the second logstash.conf (ships the normalized data to Elasticsearch):

input {
  kafka {
    client_id => "input"
    codec => json
    topics => ["filebeats_replay"]
    bootstrap_servers => "10.250.19.13:9092"
  }
}

# Normally this would be @metadata, but we renamed it to metadata earlier because we chained the data through Kafka.
output {
  elasticsearch {
    hosts => ["http://10.250.19.14:9200"]
    index => "%{[metadata][beat]}-%{[metadata][version]}-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "changeme"
    manage_template => false
  }
}

I hope you can help me. Thank you very much! I look forward to your replies.
Best regards
Robsen

What is your input? You didn't share it.

Are you using the filebeat syslog module or pointing directly to the log files?

The Filebeat modules normally use an ingest pipeline in Elasticsearch to parse the message, but this only works if you send the data directly to Elasticsearch.

If you use Logstash, you need to use the pipeline option in the elasticsearch output to tell Elasticsearch which ingest pipeline it needs to use.
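
For your second logstash.conf that would look roughly like the sketch below. It is only a sketch based on the documented Kafka-plus-Logstash example and on your own configs: it assumes Filebeat writes the pipeline name into @metadata.pipeline and that, because of your earlier rename, it arrives in the second pipeline as [metadata][pipeline].

output {
  if [metadata][pipeline] {
    elasticsearch {
      hosts => ["http://10.250.19.14:9200"]
      index => "%{[metadata][beat]}-%{[metadata][version]}-%{+YYYY.MM.dd}"
      # Run the module's ingest pipeline so process.name and the timestamp get parsed.
      pipeline => "%{[metadata][pipeline]}"
      user => "elastic"
      password => "changeme"
      manage_template => false
    }
  } else {
    # Fallback for events that carry no pipeline name.
    elasticsearch {
      hosts => ["http://10.250.19.14:9200"]
      index => "%{[metadata][beat]}-%{[metadata][version]}-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "changeme"
      manage_template => false
    }
  }
}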

You need to do the extra step explained here in the documentation.
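
Part of that extra step, as I understand it: since Filebeat ships to Kafka and not to Elasticsearch, it does not load the module's ingest pipelines by itself, so you need to load them once manually. A sketch of that one-off setup command, assuming you mean the system module (which contains the syslog fileset) and reusing the Elasticsearch host and credentials from your config; adjust the module name, host, and credentials to your setup:

# Load the module's ingest pipelines once, from a host that can reach Elasticsearch.
filebeat setup --pipelines --modules system \
  -E 'output.kafka.enabled=false' \
  -E 'output.elasticsearch.hosts=["10.250.19.14:9200"]' \
  -E 'output.elasticsearch.username=elastic' \
  -E 'output.elasticsearch.password=changeme'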

Thank you for your quick feedback.
And sorry, yes, I forgot to mention it. I use the syslog module.
I will try it out and get back to you!

I'm sorry that I'm getting back to you so late.
I still had to "fight" a bit, but it works now! Thanks!
