Using 7.17. Currently I read log files in with Filebeat (or Filebeat/Logstash) and save them to ES.
Log lines are dissected and we have a field called loglevel, with values ERROR, WARN and INFO. If the message is an ERROR, a field called event gets tagged onto the message before it's saved in ES. Data is saved in a data stream, and new indexes are created daily. These indexes are deleted after 7 days.
I would like different data retention policies based on the loglevel field value: anything at ERROR I would like to keep for 30 days, anything else discard after a week.
Currently, to do this, I ingest the data into my main index, and once a day a Logstash node queries this index, extracts anything with the event field / loglevel=error, and saves it to a new index with a different data retention policy. This feels more complicated than I want it to be.
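For context, the daily job is roughly this shape (index names and hosts are made up, and I've trimmed the real config down):

```
input {
  elasticsearch {
    hosts    => ["http://localhost:9200"]
    index    => "logs-app-*"
    query    => '{ "query": { "term": { "loglevel": "ERROR" } } }'
    schedule => "0 2 * * *"   # re-query once a day
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logs-app-errors"   # index with the 30-day retention policy
  }
}
```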
Is there a way to have Filebeat send documents to both indexes? Or perhaps an ES pipeline that would allow me to save to both places at time of ingestion? (I would like to save the complete document.)
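One idea I've been toying with, in case it helps frame the question: duplicating the event inside Logstash with the clone filter and routing the copy by type in the output section. A rough sketch of what I mean (untested, names made up):

```
filter {
  if [loglevel] == "ERROR" {
    # clone adds a copy of the event; the copy's "type" is set to the clone name
    clone { clones => ["error_copy"] }
  }
}
output {
  if [type] == "error_copy" {
    elasticsearch { index => "logs-app-errors" }   # 30-day retention
  } else {
    elasticsearch { index => "logs-app" }          # 7-day retention
  }
}
```

I'm not sure whether this is the idiomatic way to fan out one document to two indexes, or whether something at the Filebeat or ingest-pipeline level would be cleaner.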
Interested in ideas from someone who has had a similar issue.