Logs duplicated across multiple indexes

Hi,

In my output section, I have multiple conditions. Each condition allows me to route one type of log to the correct Elasticsearch index.

This is my Logstash output:

> output {
>         if "nginx" in [tags] {
>                 if "_grokparsefailure" not in [tags] {
>                         elasticsearch {
>                                 hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
>                                 index => "logstash-isg-%{+YYYY.MM.dd}"
>                         }
>                 }
>         }
>         else if "scarlette" in [tags] {
>                 if "_grokparsefailure" not in [tags] {
>                         elasticsearch {
>                                 hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
>                                 index => "logstash-scarlette-%{+YYYY.MM.dd}"
>                         }
>                 }
>         }
>         else if "serveur_owncloud" in [tags] {
>                 if "_grokparsefailure" not in [tags] {
>                         elasticsearch {
>                                 hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
>                                 index => "logstash-owncloud-%{+YYYY.MM.dd}"
>                         }
>                 }
>         }
>         else if "brouette" in [tags] or "poussette" in [tags] {
>                 if "_grokparsefailure" not in [tags] {
>                         elasticsearch {
>                                 hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
>                                 index => "logstash-mta-%{+YYYY.MM.dd}"
>                         }
>                 }
>         }
>         else if "serveur_proxy" in [tags] or "serveur_dns" in [tags] {
>                 if "_grokparsefailure" not in [tags] {
>                         elasticsearch {
>                                 hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
>                                 index => "logstash-proxydns-%{+YYYY.MM.dd}"
>                         }
>
> #                       file {
> #                               path => "/var/log/LS-redis-flux.log"
> #                       }
>                 }
>         }
> }

The specified indexes are created, and they do contain the logs I want.

But Logstash continues to create its default index "logstash-", and this index contains a copy of every log: one copy is in "logstash-" and the other is in the correct index.

This seems strange to me, because I always specify the index name in my Logstash output conditions. I don't know why this index is created or why logs are copied into it.

Try it without the else clauses, and move the grokparsefailure check to the outside.

e.g.

output {
    if "_grokparsefailure" not in [tags] {
        if "nginx" in [tags] {
                elasticsearch {
                        hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
                        index => "logstash-isg-%{+YYYY.MM.dd}"
                }
        }

        if "scarlette" in [tags] {
                elasticsearch {
                        hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
                        index => "logstash-scarlette-%{+YYYY.MM.dd}"
                }
        }

        if "serveur_owncloud" in [tags] {
                elasticsearch {
                        hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
                        index => "logstash-owncloud-%{+YYYY.MM.dd}"
                }
        }

        if "brouette" in [tags] or "poussette" in [tags] {
                elasticsearch {
                        hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
                        index => "logstash-mta-%{+YYYY.MM.dd}"
                }
        }

        if "serveur_proxy" in [tags] or "serveur_dns" in [tags] {
                elasticsearch {
                        hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
                        index => "logstash-proxydns-%{+YYYY.MM.dd}"
                }

#                file {
#                        path => "/var/log/LS-redis-flux.log"
#                }
        }
    }
}
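A further variant worth sketching (the `[@metadata][idx]` field name is my invention, not something from your config): tag each event with an index fragment in the filter stage and keep a single `elasticsearch` output, so there is only one place an index name can come from.

```
# Hypothetical variant: one mutate per log type in the filter stage,
# then a single output. Fields under [@metadata] are never sent to
# Elasticsearch, so the routing field does not pollute your documents.
filter {
  if "nginx" in [tags] {
    mutate { add_field => { "[@metadata][idx]" => "isg" } }
  }
  # ... one such block per log type ...
}

output {
  if "_grokparsefailure" not in [tags] and [@metadata][idx] {
    elasticsearch {
      hosts => ["10.1.101.1", "10.1.102.1", "10.1.103.1"]
      index => "logstash-%{[@metadata][idx]}-%{+YYYY.MM.dd}"
    }
  }
}
```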



Thanks for the reply,

Unfortunately, these changes to the Logstash configuration file did not change anything...

Please state which versions of Logstash and the elasticsearch output plugin you are using.

I use Logstash 2.2.0

I don't know how to see the version of the logstash-output-elasticsearch plugin. From the /opt/logstash directory I can only install, uninstall, update, pack, unpack or list plugins. But when I list all the installed plugins, it just gives me the plugin names, not the versions...

Use bin/plugin list --verbose

Thanks,

So I use Logstash 2.2.0 and logstash-output-elasticsearch 2.4.1

When you run Logstash do you specify a file or a directory for the -f option?

    -f, --config CONFIG_PATH      Load the logstash config from a specific file
                                  or directory.  If a directory is given, all
                                  files in that directory will be concatenated
                                  in lexicographical order and then parsed as a
                                  single config file. You can also specify
                                  wildcards (globs) and any matched files will
                                  be loaded in the order described above.

I run Logstash as a service and I don't specify a file or directory, so by default it uses my config file "logstash.conf" in the conf.d directory.
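The help text quoted above matters here: when the service points -f at the conf.d directory, Logstash concatenates every file in it into a single config. A forgotten extra file containing an unconditional `elasticsearch` output would produce exactly this symptom, since each event would then be sent by both outputs. A minimal sketch of such a leftover (the file name is an assumption; check what `ls conf.d` actually shows on your machine):

```
# Hypothetical leftover file, e.g. conf.d/99-old.conf: an unconditional
# output with no "index" setting sends EVERY event to the plugin's
# default logstash-%{+YYYY.MM.dd} index, duplicating each log.
output {
  elasticsearch {
    hosts => ["10.1.101.1"]
    # no index => the default index name applies
  }
}
```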

I have looked at the code for elasticsearch-output 2.4.1.
Hypothesis: it is retrying, and the index gets reset to the default on the second try.

If this hypothesis is wrong, you should not see any log messages containing the text retrying failed action with response code

Are you able to see the Logstash logs? If so, is there a warning-level line with the above text in it?

Are you able to see the Elasticsearch logs? What do they show about the HTTP bulk index requests?
Can you see whether the duplicates all arrive after the originals are done, or are they interleaved?

I can't find the log line "retrying failed action with response code" in the Logstash log file. I don't have any warning logs.

I checked the Elasticsearch logs, and I didn't see anything about the HTTP bulk index requests...

The duplicate logs arrive at the same time as the originals.