Fields repeating same values


(maverick) #1

I have three indices which are sent from Logstash to Elasticsearch. The logs are of the same format; the difference is that they come from different servers and are received on different ports in Logstash. The grok pattern for them is exactly the same.

The issue I face now is that every field in each of these three indices is tripled. When I remove one of the config files from Logstash, the repetition drops to twice. If I add another file, i.e. four Logstash configs, it quadruples. Really odd behaviour.

The "message" field in elasticsearch is normal but its only the fields that are multiplying. Any ideas why such odd behaviour?


(Christian Dahlqvist) #2

All config files in the directory are concatenated, so each event is processed by all filters and goes to all outputs unless you control this using conditionals. Why are you creating an index per server? Why not put all the events in a single index?
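A minimal sketch of the conditional routing described above, assuming two TCP inputs that tag their events via `type` (the port numbers, `type` values, and index names here are illustrative assumptions, not the poster's actual config):

```
input {
  # Each input tags its events so the outputs can tell them apart.
  tcp { port => 5001 type => "server1" }
  tcp { port => 5002 type => "server2" }
}

output {
  # Without these conditionals, every event would be sent by every output.
  if [type] == "server1" {
    elasticsearch {
      hosts => [ "elasticsearch:9200" ]
      index => "index1-%{+YYYY.MM.dd}"
    }
  } else if [type] == "server2" {
    elasticsearch {
      hosts => [ "elasticsearch:9200" ]
      index => "index2-%{+YYYY.MM.dd}"
    }
  }
}
```

The same `if [type] == "..."` guards can be wrapped around filter blocks so each grok only runs on its own input's events.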


(maverick) #3

So even if there is an input section in each config file listening on a different port, they are all concatenated?

There is only one output per config. So each config has an output section pointing to Elasticsearch with a different index name. For example:

output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
    index => 'index1-%{+YYYY.MM.dd}'
  }
}

The reason I went for multiple indices for different servers was that I noticed messages being dropped when everything was in one config.


(Christian Dahlqvist) #4

Yes.


(maverick) #5

Thanks. Any idea why messages can get lost? I am using the TCP protocol, so rsyslog sends the messages to Logstash, which then parses them and sends them on to Elasticsearch. I noticed quite a lot of messages being lost, hence I tried this approach.
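For reference, the pipeline described above (rsyslog over TCP into Logstash, then on to Elasticsearch) could look roughly like this; the port, grok pattern, and index name are assumptions for illustration, not the actual configuration:

```
input {
  tcp {
    port => 5514    # assumed port; must match the target in rsyslog's forwarding rule
  }
}

filter {
  grok {
    # SYSLOGLINE is one of Logstash's stock patterns; the real pattern may differ.
    match => { "message" => "%{SYSLOGLINE}" }
  }
}

output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
    index => "index1-%{+YYYY.MM.dd}"
  }
}
```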


(Christian Dahlqvist) #6

No, I do not know why messages would get lost, but it would help if we could see the full configuration.


(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.