Why are field values repeated in Kibana?

Update: I found the cause. I had two configuration files in the Logstash config directory, and both contained the same grok filter.

Thanks to everyone who followed this thread.

Hi

I'm using ELK 7.13.1 on Ubuntu 18.04. I tried reading the sample log from https://s3.amazonaws.com/logzio-elk/apache-daily-access.log.
After creating the index and going back to the Discover tab in Kibana, I see some fields whose values are repeated twice. The raw log lines themselves are not duplicated.
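
For illustration, one of the affected documents shows up in Discover roughly like this (field names come from the COMBINEDAPACHELOG pattern; the values below are made up, not copied from my data):

clientip   83.149.9.216, 83.149.9.216
verb       GET, GET
response   200, 200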

I have checked ELK 7.12.1 on CentOS 7 and ELK 7.13.1 on CentOS 7, and I don't have this problem there.

[Screenshot: Discover tab in Kibana showing fields with duplicated values]

My Logstash config file:

input {
  file {
    path => "/var/log/apache-daily-access.log"
    start_position => "beginning"  # read the file from the start instead of tailing it
    sincedb_path => "/dev/null"    # don't persist the read position between runs
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch {
    hosts => ["192.168.20.8:9200"]
  }
}

Can anyone explain what causes this and how to fix it?

Thanks

This is really an Elasticsearch question. See this blog post.


Hi

I found the cause: I had two configuration files in the Logstash config directory, and both contained the same grok filter. Logstash merges every file in that directory into a single pipeline, so the grok filter ran twice on each event; the second pass appended its captures to the fields that already existed, turning each field into an array of two identical values. Removing the duplicate file fixed it.

Thanks to everyone who followed this thread.
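
For anyone who hits the same thing: a quick way to check is to list every file Logstash will merge into one pipeline and then validate a single file explicitly. The paths below assume a standard deb/rpm package install; adjust them for your setup:

# List all the files Logstash merges into a single pipeline
ls /etc/logstash/conf.d/

# Validate one explicit config file instead of the whole directory
/usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/apache.conf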

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.