Hello,
I'm using Logstash 2.2.2 and Elasticsearch 2.2.0.
I parse logs and put them into Elasticsearch.
Each log line contains a transactionId, so I wanted to use the "update" action of the elasticsearch output and use the transactionId as the document_id.
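Here is roughly what my output section looks like (a minimal sketch; the hosts and index name are placeholders, transactionId comes from my parsed log lines):

output {
  elasticsearch {
    hosts         => ["localhost:9200"]   # placeholder host
    index         => "transactions"       # placeholder index name
    action        => "update"
    document_id   => "%{transactionId}"
    doc_as_upsert => true                 # create the document on the first event
  }
}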
This works fine - the final document contains all the fields defined in Logstash.
However, if a field exists in every log line (the "message" field in my case), its value is overwritten by each update.
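For illustration, with made-up values, two events for the same transaction look like this:

{ "transactionId": "42", "message": "payment received" }
{ "transactionId": "42", "message": "payment processed" }

After both updates the document only contains message => "payment processed", but I would like message to end up as ["payment received", "payment processed"].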
Is there an option to make Elasticsearch create an array of messages and append incoming ones instead of overwriting the existing value?