Logstash upsert across multiple indexes

Hi there, I need to import data into Elasticsearch with upsert logic. For this I use the elasticsearch output plugin in Logstash:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myindex-%{+YYYY.MM.dd}"
    document_id => "%{message_id}"
    doc_as_upsert => true
    action => "update"
  }
}

which works against the current daily index myindex-%{+YYYY.MM.dd}.

The problem is that the document to be updated may live in an earlier daily index, not in today's myindex-%{+YYYY.MM.dd}, so the upsert doesn't find it and creates a new copy in today's index instead.
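To make the mismatch concrete, here is a minimal Python sketch (the `daily_index` helper is hypothetical; it just mirrors Logstash's `%{+YYYY.MM.dd}` date math): the target index is derived from the event's `@timestamp`, so an update arriving days after the original event resolves to a different index than the one holding the original document.

```python
from datetime import datetime, timezone

def daily_index(ts: datetime, prefix: str = "myindex-") -> str:
    """Mimic Logstash's "myindex-%{+YYYY.MM.dd}" sprintf for a given @timestamp."""
    return prefix + ts.strftime("%Y.%m.%d")

# The original event was indexed on July 1st ...
original = daily_index(datetime(2021, 7, 1, 12, 0, tzinfo=timezone.utc))
# ... but the update event arrives on July 9th, so it targets a different index:
update = daily_index(datetime(2021, 7, 9, 12, 0, tzinfo=timezone.utc))

print(original)  # myindex-2021.07.01
print(update)    # myindex-2021.07.09 -- the upsert lands here, duplicating the doc
```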

So what's the best approach to have the upsert search across multiple indices (or an index pattern), and insert the document into the latest index myindex-%{+YYYY.MM.dd} only when it doesn't exist yet?

I tried the elasticsearch filter plugin, which I used to look up the document and get the exact name of the index it lives in, but I was wondering if there is an easier way.

This is the template I use for the elasticsearch filter plugin:

{
  "size": 1,
  "sort" : [ { "@timestamp" : "desc" } ],
  "query": {
    "query_string": {
      "query": "_id:Bp4nynoBuY_R2Bpz1Xku"
    }
  },
  "_source": []
}
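As a side note, the hard-coded `_id` above is just a value from a manual test. Assuming the elasticsearch filter interpolates event fields (sprintf) in the query template, and that the ID to look up is carried in the event's `message_id` field as in the output config above, the template could be parameterized like this (a sketch, untested):

```json
{
  "size": 1,
  "sort" : [ { "@timestamp" : "desc" } ],
  "query": {
    "query_string": {
      "query": "_id:%{message_id}"
    }
  },
  "_source": []
}
```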

which is referenced by the following filter in Logstash:

filter {
   elasticsearch {
      hosts => ["localhost:9200"]
      index => "test*"
      query_template => "template.json"
      docinfo_fields => { 
        "_index" => "retrieved_index"
      }
   }
}
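With that filter in place, one way I was thinking of closing the loop (a sketch, untested) is to route the event to the index the lookup found, falling back to the daily index when no existing document was retrieved:

```
output {
  if [retrieved_index] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "%{retrieved_index}"
      document_id => "%{message_id}"
      doc_as_upsert => true
      action => "update"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "myindex-%{+YYYY.MM.dd}"
      document_id => "%{message_id}"
      doc_as_upsert => true
      action => "update"
    }
  }
}
```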

Any recommendations?
Thanks

I initially thought you might be able to put an alias over the indices, but then I remembered that an alias can only have a single write index.
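For reference, an alias can span many indices for reads, but only one of them can be flagged as the write index, which is why the alias route doesn't cover this case (index names here are illustrative):

```json
POST /_aliases
{
  "actions": [
    { "add": { "index": "myindex-2021.07.01", "alias": "myindex", "is_write_index": false } },
    { "add": { "index": "myindex-2021.07.09", "alias": "myindex", "is_write_index": true } }
  ]
}
```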

You cannot write to multiple indices and update a whole bunch of documents across them at the same time.

Updating a document is an expensive process, as it takes 3x the I/O (read, delete, write) versus a plain write, and heavy in-place updates are generally not what Elasticsearch is designed for.

Your approach of using the elasticsearch filter to find the index is probably the best one, but now you're doing 4x the I/O (read, read, delete, write). So as long as you are OK with that, you have the only solution I can think of.