Duplicate Events

Hello Forum,

I have the following setup (single node):

Logstash 2.1.0 + Elasticsearch 2.1.0 + Kibana 4, with Logstash receiving logs from several sources (nxlog, forwarded syslog, etc.). For context, my input section looks roughly like this (the ports and types are examples, not my exact values):
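input {
  # syslog forwarded from other hosts (the port is just an example)
  syslog {
    port => 514
  }
  # events shipped by nxlog over TCP (port and type are examples)
  tcp {
    port => 3515
    type => "rhel"
  }
}

I am seeing exactly 9 duplicate events for each log line received: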

Jan 30 11:13:12 rhel sshd[9766]: Failed password for root from 192.168.1.1 port 43242 ssh2

The above log line turns into 9 entries in Elasticsearch. However, when I sent the output to a file instead (sketched below the config), there were no duplicates, so the issue only occurs with the elasticsearch output. This is the output config:

output {
  elasticsearch {
    hosts => "localhost"
  }
}
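For reference, the file output I used for the comparison looked roughly like this (the path is just an example):

output {
  # plain file output for comparison; the path is an example
  file {
    path => "/tmp/logstash-test.log"
  }
}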

Please help me to solve this issue.

Thank you.

Regards,
Sathish.

What does the rest of the config look like?

Hello Warkolm,

If I store the data in a new index, I don't see any duplicates. I created an index called "rhel7" and used the following config:

output {
  if [type] == "rhel" {
    elasticsearch {
      hosts => "localhost"
      index => "rhel7"
    }
  }
}

The rhel7 index has no duplicates; only the default index has the issue. The following are the settings of the default index:

{"logstash-2016.02.01":{"settings":{"index":{"creation_date":"1454284536872","refresh_interval":"5s","number_of_shards":"5","number_of_replicas":"1","uuid":"CRbjaZ4WT_y9PtZMcqL56Q","version":{"created":"2010099"}}}}}

I am not sure why the default index has duplicate entries.
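Each copy shows up as a separate document; for example, a search like the following against the default index (the query is just illustrative) returns nine hits for the single sshd line above:

curl -XGET 'localhost:9200/logstash-2016.02.01/_search?pretty' -d '{
  "query": {
    "match_phrase": { "message": "Failed password for root" }
  }
}'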