Did you not find anything when you searched the forum?
Hello
Thank you, Christian, for your interest.
I created 2 separate indices; my 2 conf files are:
root@kvm:~# cat /etc/logstash/conf.d/auth.conf
input {
  tcp {
    port => "5001"
    codec => json
    tags => ["syslogauth"]
  }
}

filter {
  grok {
    named_captures_only => false
    break_on_match => true
    match => { "message" => [" New session %{NUMBER} of user %{USERNAME:user}."," Accepted password for %{USERNAME:user} from %{IP:ip} port %{NUMBER} ssh2"," Failed passw…
All config files in the directory are concatenated into a single pipeline. This means that each event generated by an input plugin goes through all filters and is sent to all outputs (all 5 of them). You can get around this by creating a single config file with multiple inputs and conditionals, or by using the relatively new multiple pipelines feature. This is a common misunderstanding, so you should easily be able to find examples.
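A minimal sketch of the conditional approach, assuming the "syslogauth" tag from your auth input and a hypothetical "syslogother" tag set on the second input (hosts and index names are placeholders):

# Single pipeline: route events by the tag each input sets.
filter {
  if "syslogauth" in [tags] {
    # auth-specific grok filters go here
  }
}

output {
  if "syslogauth" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "auth-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "other-%{+YYYY.MM.dd}"
    }
  }
}

With multiple pipelines (pipelines.yml, available since Logstash 6.0), each config file instead runs in its own isolated pipeline, so no conditionals are needed.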
When I ingest documents via Logstash, I find that they are created twice.
Setup:
Filebeat -> Logstash -> Elasticsearch
Result:
The same messages are ingested into Elasticsearch with different _id values
So I ran the following tests:
Test 1: Ingest documents directly from Filebeat
i.e. Filebeat -> Elasticsearch
Result: 1 document is created
Finding: the issue should be related to Logstash or Elasticsearch
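For reference, the direct test just means enabling Filebeat's Elasticsearch output instead of its Logstash output; Filebeat allows only one output to be enabled at a time. A minimal filebeat.yml sketch, with placeholder hosts and log path:

filebeat.inputs:
  - type: log
    paths:
      - /var/log/auth.log

# Only one output can be enabled at a time in Filebeat.
output.elasticsearch:
  hosts: ["localhost:9200"]

# For the normal Filebeat -> Logstash setup this would instead be:
# output.logstash:
#   hosts: ["localhost:5044"]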
After reading the article: https://www.elastic.co/blog/logstash-lessons-handling-du…
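The technique that article covers is deriving a deterministic document ID from the event content with the fingerprint filter, so a re-sent event overwrites the existing document instead of creating a new one. A minimal sketch, assuming the default message field (hosts and index name are placeholders):

filter {
  fingerprint {
    # Hash the raw message into a stable ID stored in @metadata.
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "MURMUR3"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-index"
    # Using the fingerprint as _id makes a repeated event update the
    # existing document rather than create a duplicate with a new _id.
    document_id => "%{[@metadata][fingerprint]}"
  }
}

This makes ingestion idempotent: the second copy of an event updates the first document instead of being indexed again.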