I have 3 Logstash instances running on the same machine, all writing to the same ES index. All 3 read file input from the same location (/etc/logstash/conf.d). If there are 500 files in that location, the number of hits on the ES index comes out to more than 500, because the same file gets read by more than one instance, producing duplicate data on the ES index. How can I make sure the 3 Logstash instances share the files so that a single file is read by only one instance?
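For context, the kind of split I'm after would look roughly like this per instance (the paths, index name, and sincedb locations below are placeholders, not my real config):

```
# Sketch for instance 1 of 3 -- instances 2 and 3 would point at their own disjoint path globs
input {
  file {
    path => "/var/data/logs/shard1/*.log"          # disjoint glob so no file is read twice
    start_position => "beginning"
    sincedb_path => "/var/lib/logstash/sincedb_shard1"
  }
}

filter {
  # Extra safety net: derive a deterministic _id from the event content,
  # so an accidental re-read overwrites the same document instead of duplicating it.
  fingerprint {
    source => ["message"]
    target => "[@metadata][fingerprint]"
    method => "SHA256"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "my-index"                             # placeholder index name
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```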
Hello and welcome,
Can you provide a little more context?
First, why are you running 3 instances of Logstash on the same machine, and how are you running them? Please share the command line you are using.
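As an aside, a single Logstash service can normally run several pipelines side by side through pipelines.yml, which often removes the need for separate instances. A minimal sketch (the pipeline ids and config file names are placeholders, not your setup):

```
# /etc/logstash/pipelines.yml -- one Logstash process, multiple independent pipelines
- pipeline.id: shard1
  path.config: "/etc/logstash/conf.d/shard1.conf"
- pipeline.id: shard2
  path.config: "/etc/logstash/conf.d/shard2.conf"
- pipeline.id: shard3
  path.config: "/etc/logstash/conf.d/shard3.conf"
```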
What do you mean by that? What files do you have in /etc/logstash/conf.d? Log files or configuration files? This path should contain only configuration files; it should not contain log files, as that could lead to confusion.
You need to share your configurations.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.