Copy specific data to another index automatically

Hello!

Currently I use Filebeat to send data to Logstash and then on to Elasticsearch. These are mostly application logs that contain a lot of data. Everything is sent to an index called filebeat-0000x, which is rolled over and kept for 3 months.

Sometimes there are very important logs that need to be saved for much longer, and it would be better to store them in a separate index with a completely different ILM policy.
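For context, a long-retention ILM policy for such an index could look roughly like the sketch below (the policy name, rollover thresholds, and the two-year retention are assumptions for illustration, not from this thread):

```
PUT _ilm/policy/important-logs-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": {
            "max_age": "30d",
            "max_primary_shard_size": "50gb"
          }
        }
      },
      "delete": {
        "min_age": "730d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}
```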

My first idea is to write a script that queries for the specific results, checks whether a document with the same `_id` already exists in the new index, and if not, copies the document over and removes unneeded fields. This would create an index that contains only the needed data and would not take up much disk space.
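For what it's worth, most of that script can be done server-side with the `_reindex` API: a query selects only the matching documents and a `_source` filter keeps only the fields you need. The index names, query field, and field list below are made-up placeholders:

```
POST _reindex
{
  "conflicts": "proceed",
  "source": {
    "index": "filebeat-*",
    "query": {
      "term": { "tags": "important" }
    },
    "_source": ["@timestamp", "message", "log.level"]
  },
  "dest": {
    "index": "important-logs",
    "op_type": "create"
  }
}
```

Setting `"op_type": "create"` together with `"conflicts": "proceed"` makes repeated runs skip documents that already exist in the destination, similar to the `_id` check in the script idea. Reindex is a one-shot call, though, so it would need to be triggered periodically.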

Maybe there is a much more elegant way to "copy" only needed data to a separate index?

Maybe a Transform can help you do that.

Check the documentation about transforms.
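As a rough sketch, a continuous "latest" transform can keep copying matching documents into a dedicated index as new data arrives. The transform id, query, `unique_key`, and field names here are assumptions for illustration and would need to match your actual mappings:

```
PUT _transform/important-logs-copy
{
  "source": {
    "index": "filebeat-*",
    "query": {
      "term": { "tags": "important" }
    }
  },
  "latest": {
    "unique_key": ["event.id"],
    "sort": "@timestamp"
  },
  "dest": {
    "index": "important-logs"
  },
  "frequency": "5m",
  "sync": {
    "time": {
      "field": "@timestamp",
      "delay": "60s"
    }
  }
}

POST _transform/important-logs-copy/_start
```

The destination index can then have its own ILM policy with a longer retention period.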


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.