Problem with "host" field when parsing linux system logs with Filebeat pipeline

When I parse linux auth.log, I get this error (in Kibana):
error.message: cannot set [hostname] with parent object of type [java.lang.String] as part of path [host.hostname]

I ingest logs into Elastic from an existing collector, so I use a pipeline. The config is:

input {
  udp {
    port => 10002
    codec => line
  }
}

filter {
  if "linux_secure" in [tags] {
    mutate {
      add_field => { "[@metadata][pipeline]" => "filebeat-7.2.0-system-auth-pipeline" }
    }
  }
}

output {
  if "linux_secure" in [tags] {
    elasticsearch {
      pipeline => "%{[@metadata][pipeline]}"
      hosts => ["elk-1:9200"]
      index => 'filebeat-linux_secure-12-%{+YYYY.MM.dd}'
    }
  }
}

Could you please tell me how I can debug this problem? Some steps would help.


Is this your full filebeat.yml? You're using a default udp input with no processors, etc.? The error looks like an index configuration issue, as though the standard field host.hostname is failing to index because the index template thinks that host is a string. Can you check your index templates? (e.g. run GET /_template/filebeat* in the kibana console)
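As a sketch of that check (the template name and the mapping shown here are illustrative — your actual template may differ):

```
GET /_template/filebeat*

# In the response, look at mappings.properties.host.
# For ECS, host must be mapped as an object, roughly:
#
#   "host": {
#     "properties": {
#       "hostname": { "type": "keyword" },
#       "ip":       { "type": "ip" }
#     }
#   }
#
# If host is instead mapped as a string ("text"/"keyword"),
# an ingest pipeline cannot set host.hostname beneath it.
```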

I think I found the problem: the udp input plugin adds a `host` field by default, and that breaks ECS.
Same issue: Move the `host` field to a location in line with ECS or make it configurable · Issue #42 · logstash-plugins/logstash-input-udp · GitHub
Very bad behavior.
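To illustrate the conflict (the values below are made up): the udp input sets `host` to the sender address as a plain string, while the Filebeat system ingest pipeline expects `host` to be an object that it can add `host.hostname` to:

```
# Event as emitted by the udp input — host is a string:
{ "message": "...", "host": "10.1.2.3", "tags": ["linux_secure"] }

# Shape the ECS-aware ingest pipeline needs — host as an object:
{ "message": "...", "host": { "hostname": "web-01", "ip": ["10.1.2.3"] } }
```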
So, I think I found the solution: since this commit (Make source IP field configurable by praseodym · Pull Request #43 · logstash-plugins/logstash-input-udp · GitHub) there is an option in the udp plugin.
Udp input plugin | Logstash Reference [7.15] | Elastic
So I added a line to the udp input block:
source_ip_fieldname => "host.ip"
Now it works, but I'm not sure this field name is the best solution.
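For what it's worth, a sketch of the adjusted input block. Note that Logstash's nested-field reference syntax is `[host][ip]`; a plain `"host.ip"` may create a single top-level field whose name contains a literal dot rather than an ECS-style nested object, so this is worth verifying against your Logstash version:

```
input {
  udp {
    port => 10002
    codec => line
    # Put the sender address under an ECS-style nested field
    # instead of the legacy top-level string field "host".
    source_ip_fieldname => "[host][ip]"
  }
}
```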

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.