Duplicate Events

Hello Forum,

I have the following setup (single node):

Logstash 2.1.0 + Elasticsearch 2.1.0 + Kibana 4, with Logstash receiving logs from different sources (nxlog, syslog forwarding, etc.). I am seeing exactly 9 duplicate events for each log line received:

Jan 30 11:13:12 rhel sshd[9766]: Failed password for root from port 43242 ssh2

The above log line turns into 9 entries in Elasticsearch. But when I tried sending the output to a file, there were no duplicates; the issue only occurs with the elasticsearch output. The following is the output config:

output {
  elasticsearch {
    hosts => "localhost"
  }
}
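
For comparison, the file output I tested with was along these lines (the path here is just an example, not my exact config):

```
output {
  # write each event to a local file to compare counts
  # with what ends up in Elasticsearch
  file {
    path => "/tmp/logstash-events.log"
  }
}
```

With this output, the file contained exactly one line per event, so the duplication only shows up on the elasticsearch output side.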

Please help me to solve this issue.

Thank you.


What does the rest of the config look like?

Hello Warkolm,

If I store the data in a new index, I don't see any duplicates. I created an index called "rhel7" and used the following config:

output {
  if [type] == "rhel" {
    elasticsearch {
      hosts => "localhost"
      index => "rhel7"
    }
  }
}

The index rhel7 has no duplicate values; only the default index has the issue. The following are the settings of the default index:


I am not sure why the default index has duplicate entries.