Logstash saving data into two indices (Duplicated Data)

Good day,

I am currently processing the data from Filebeat's system module with Logstash. The data provided by the system module (the syslog and auth filesets) is parsed correctly, but the resulting documents were being written to two indices at the same time, for example:

filebeat-7.6.2-2020.11.15
filebeat-system-7.6.2-2020.11.16

This was due to a conflict in the output section of my Logstash pipeline, which was structured like this:

output {
  if condition { output }
  if condition { output }
  else { output }
}

The two if statements are independent, and the else belongs only to the second if, so an event that matches the first condition but not the second is sent to two outputs.
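For illustration, the flawed shape with the actual conditions would look something like this (a reconstruction, not my original file, with the elasticsearch blocks elided):

output {
  if [event][module] == "system" {
    elasticsearch { ... }   # system events are indexed here, into filebeat-system-*
  }
  if [event][module] == "mysql" {
    elasticsearch { ... }
  }
  else {
    elasticsearch { ... }   # the else only negates the mysql test, so system events land here too, into filebeat-*
  }
}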

Filebeat's output is configured to send to Logstash only; the Elasticsearch output is disabled (see the sketch after this list). Also, none of the following settings are defined in Filebeat's configuration:

setup.template.name:
setup.template.pattern:
output.elasticsearch.index:
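For reference, a minimal sketch of the relevant filebeat.yml output section (the Logstash hostname and port are assumptions on my part):

# filebeat.yml -- output section (sketch)
output.logstash:
  hosts: ["logstash-1:5044"]

# The Elasticsearch output stays disabled (commented out):
#output.elasticsearch:
#  hosts: ["http://elasticsearch-1:9200"]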

I updated my Logstash output configuration to the one below, which solved the issue. All branches now form a single if / else if chain, so each event can match at most one output:

output {
  if [event][module] == "system" {
    elasticsearch {
      user => 'logstash_writer'
      password => 'my_dummy_password'
      hosts => ["http://elasticsearch-1:9200","http://elasticsearch-2:9200","http://elasticsearch-3:9200","http://elasticsearch-4:9200"]
      cacert => '/etc/logstash/config/certs/logstash.pem'
      index => "%{[@metadata][beat]}-%{[event][module]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      #index => "divya-%{[@metadata][beat]}-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
  else if [event][module] == "mysql" {
    elasticsearch {
      user => 'logstash_writer'
      password => 'my_dummy_password'
      hosts => ["http://elasticsearch-1:9200","http://elasticsearch-2:9200","http://elasticsearch-3:9200","http://elasticsearch-4:9200"]
      cacert => '/etc/logstash/config/certs/logstash.pem'
      index => "%{[@metadata][beat]}-%{[event][module]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      #index => "divya-%{[@metadata][beat]}-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
  else if ![event][module] {
    elasticsearch {
      user => 'logstash_writer'
      password => 'my_dummy_password'
      hosts => ["http://elasticsearch-1:9200","http://elasticsearch-2:9200","http://elasticsearch-3:9200","http://elasticsearch-4:9200"]
      cacert => '/etc/logstash/config/certs/logstash.pem'
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      #index => "divya-%{[@metadata][beat]}-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
}
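One way to tighten this further: the three outputs differ only in the index name, so the index can be computed once in the filter block and written by a single elasticsearch output. A minimal sketch, where the [@metadata][target_index] field name is my own choice rather than anything Filebeat sets; this variant also covers modules other than system and mysql, which the three-branch version silently drops:

filter {
  # Stash the per-event index name in @metadata so it is never indexed itself.
  if [event][module] {
    mutate {
      add_field => { "[@metadata][target_index]" => "%{[@metadata][beat]}-%{[event][module]}-%{[@metadata][version]}" }
    }
  } else {
    mutate {
      add_field => { "[@metadata][target_index]" => "%{[@metadata][beat]}-%{[@metadata][version]}" }
    }
  }
}

output {
  elasticsearch {
    user     => 'logstash_writer'
    password => 'my_dummy_password'
    hosts    => ["http://elasticsearch-1:9200","http://elasticsearch-2:9200","http://elasticsearch-3:9200","http://elasticsearch-4:9200"]
    cacert   => '/etc/logstash/config/certs/logstash.pem'
    index    => "%{[@metadata][target_index]}-%{+YYYY.MM.dd}"
  }
}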

Any suggestions on how I could improve this solution to avoid future issues?

You could use multiple pipelines (defined in pipelines.yml), so each data stream gets its own pipeline and output.
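For example, Logstash's pipeline-to-pipeline communication (the distributor pattern) lets a routing pipeline hand each event to exactly one downstream pipeline, so the if/else routing lives in one small place. A rough sketch, where the pipeline ids, file paths, and port are my own assumptions:

# pipelines.yml
- pipeline.id: beats-router
  path.config: "/etc/logstash/conf.d/beats-router.conf"
- pipeline.id: system
  path.config: "/etc/logstash/conf.d/system.conf"
- pipeline.id: catch-all
  path.config: "/etc/logstash/conf.d/catch-all.conf"

# beats-router.conf: receives from Filebeat and routes, never indexes
input { beats { port => 5044 } }
output {
  if [event][module] == "system" {
    pipeline { send_to => ["system"] }
  } else {
    pipeline { send_to => ["catch-all"] }
  }
}

# system.conf: filters and indexes system-module events only
input { pipeline { address => "system" } }
output {
  elasticsearch {
    # user, password, hosts, and cacert as in the configuration above
    index => "%{[@metadata][beat]}-%{[event][module]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}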
