Update ElasticSearch existing documents with new fields

Hello,

FileBeat 6.6.2
LogStash 6.6.2
ElasticSearch 5.6.10

I am trying to combine data from 2 different log files into the same document in ElasticSearch.
I read each log file with a separate instance of FileBeat, both sending output to the same LogStash instance.
Each FileBeat config file sets a different value for a custom field:

fields: {log_type: blah}
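
For context, the relevant part of each filebeat.yml looks roughly like this (the paths and log_type values below are placeholders, not my actual config):

filebeat.inputs:
  - type: log
    paths:
      - /path/to/first.log      # the second FileBeat points at the other log file
    fields:
      log_type: first_log       # the second FileBeat sets log_type: second_log

output.logstash:
  hosts: ["localhost:5044"]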

In that LogStash instance, I process the input differently based on the value of [fields][log_type] and, in both cases, send the output to ElasticSearch, using the same index for related pieces of data.

Each processing section updates different fields in ElasticSearch, but both use the same "index" and "document_id".
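
The filter section is roughly this shape (the grok patterns and field names below are simplified placeholders, not my real config):

filter {
  if [fields][log_type] == "first_log" {
    # parse the fields specific to the first log
    grok { match => { "message" => "%{GREEDYDATA:first_payload}" } }
  } else if [fields][log_type] == "second_log" {
    # parse the fields specific to the second log
    grok { match => { "message" => "%{GREEDYDATA:second_payload}" } }
  }
}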

However, what I see in Kibana is that separate documents are being created: they have the same "_id" value, but they haven't been merged.

I have this in my elasticsearch output plugin

action => "update"
doc_as_upsert => true
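
For completeness, the whole output block looks more or less like this (the host, index pattern, and id field are placeholders):

output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    index         => "mylogs-%{index_date}"   # index name built from a timestamp field in the log
    document_id   => "%{correlation_id}"      # same id for related lines from both logs
    action        => "update"
    doc_as_upsert => true
  }
}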

What am I missing here?
Could it be because the data from the second log file is read just a few seconds after the first one?

Replying to myself: I was building the index name from a timestamp field in the logs, and each log uses a different date format (mm.dd.yy vs yy.mm.dd), so I didn't realize I was updating the same document_id but in different indices :slight_smile:
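
In case it helps anyone else: one way to avoid this is to parse both timestamp formats into @timestamp with a date filter and derive the index name from that, instead of from the raw log field. A sketch of that idea (the field name and date patterns are just an example, adjust to your logs):

filter {
  date {
    # accept both log formats so every event gets the same @timestamp basis
    match  => ["log_time", "MM.dd.yy HH:mm:ss", "yy.MM.dd HH:mm:ss"]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    index         => "mylogs-%{+YYYY.MM.dd}"   # based on @timestamp, so identical for both logs
    document_id   => "%{correlation_id}"
    action        => "update"
    doc_as_upsert => true
  }
}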
