Hi there, I need to import data into Elasticsearch with upsert logic. For this I use the following Logstash output plugin:
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myindex-%{+YYYY.MM.dd}"
    document_id => "%{message_id}"
    doc_as_upsert => true
    action => "update"
  }
}
This works for documents that live in the current daily index myindex-%{+YYYY.MM.dd}, but not when the document to be updated belongs to an earlier daily index.
So what's the best approach to have the upsert search across multiple indices (or an index pattern), and, when the document doesn't exist yet, insert it into the latest index myindex-%{+YYYY.MM.dd}?
I tried the elasticsearch filter plugin, which I used to look up the document and get its exact index name, but I was wondering if there is an easier way.
This is the template I use for the elasticsearch filter plugin:
{
  "size": 1,
  "sort": [ { "@timestamp": "desc" } ],
  "query": {
    "query_string": {
      "query": "_id:Bp4nynoBuY_R2Bpz1Xku"
    }
  },
  "_source": []
}
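The _id in that template is hard-coded just for testing. If I understand correctly, the contents of query_template are run through Logstash's sprintf against each event, so referencing my message_id field should make the lookup dynamic:

{
  "size": 1,
  "sort": [ { "@timestamp": "desc" } ],
  "query": {
    "query_string": {
      "query": "_id:%{message_id}"
    }
  },
  "_source": []
}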
The template is referenced by the following filter in Logstash:
filter {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test*"
    query_template => "template.json"
    docinfo_fields => {
      "_index" => "retrieved_index"
    }
  }
}
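For completeness, here is roughly how I imagine wiring the looked-up index back into the output (pointing the lookup at myindex-* instead of my test indices). I'm assuming the filter simply leaves retrieved_index unset when the lookup returns no hits, so I can fall back to the current daily index for brand-new documents:

filter {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myindex-*"
    query_template => "template.json"
    docinfo_fields => {
      "_index" => "retrieved_index"
    }
  }
  # Assumption: retrieved_index stays unset when no existing document matched,
  # so default to the current daily index for first-time inserts.
  if ![retrieved_index] {
    mutate {
      add_field => { "retrieved_index" => "myindex-%{+YYYY.MM.dd}" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[retrieved_index]}"
    document_id => "%{message_id}"
    doc_as_upsert => true
    action => "update"
  }
}

The downside is the extra lookup query per event, which is why I'm hoping there's a lighter-weight approach.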
Any recommendations?
Thanks