My current Logstash config uses a split filter to break each event into several events, which are then sent to Elasticsearch and stored as documents.
The question is how to maintain the index after the initial ingestion. Currently the document ID is the primary key coming from a database. We're considering appending the event's split number to the primary key and using the result as the document ID.
E.g. item 001 might be split into 3 events, which we would ingest with document IDs 001-1, 001-2, and 001-3.
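A sketch of what we have in mind (the field names `items` and `primary_key` are placeholders for our actual fields): a ruby filter numbers each array element *before* the split, so the suffix is deterministic per parent event and doesn't rely on any cross-event state, and the elasticsearch output's `document_id` option combines the key and the sequence number.

```
filter {
  # Number each element of the array before splitting so every
  # split event carries its own stable sequence number.
  ruby {
    code => "
      items = event.get('items')
      if items.is_a?(Array)
        items.each_with_index { |item, i| item['seq'] = i + 1 }
        event.set('items', items)
      end
    "
  }

  # One event per array element; the element replaces the 'items' field.
  split { field => "items" }
}

output {
  elasticsearch {
    hosts       => ["localhost:9200"]
    index       => "items"
    document_id => "%{primary_key}-%{[items][seq]}"
  }
}
```

One property of this approach: re-ingesting the same source row overwrites documents 001-1..001-3 in place, but if a row later splits into fewer events (say 2 instead of 3), the stale 001-3 document would remain and need separate cleanup.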
Any recommendations on how to do this, or is there a better approach?