Archive raw logs and also output filtered logs

I'm trying to set up a pipeline that takes log input from Filebeat, outputs the raw logs to Amazon S3, AND filters the logs before storing them in Elasticsearch. Is there currently a way to configure the S3 output plugin to output only the raw message, whilst still sending the filtered messages to Elasticsearch?

As a workaround, my next thought was to run two Filebeat prospectors against the same location, with each prospector assigning different tags. I could then split on the tag to decide which events get filtered and sent to Elasticsearch, and which get output to S3. However, there seem to be issues with running multiple prospectors against the same file, so this is likely not feasible.

Any advice on my situation would be very much appreciated!

Use a clone filter to split each input event in two. Process the copy however you like and send it to Elasticsearch, and route the original, unmodified events to your S3 output.
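A rough sketch of what that pipeline could look like, assuming a legacy setup where the clone filter marks copies via the `type` field (with ECS compatibility enabled it uses tags instead, so check your version's docs); the bucket name, host, and grok pattern are placeholders:

```conf
filter {
  # Duplicate each event; the copy gets type set to "filtered"
  clone {
    clones => ["filtered"]
  }

  # Only parse the copy, leaving the original message untouched
  if [type] == "filtered" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }  # illustrative pattern
    }
  }
}

output {
  if [type] == "filtered" {
    # Parsed copies go to Elasticsearch
    elasticsearch {
      hosts => ["localhost:9200"]
    }
  } else {
    # Originals go to S3 as raw lines
    s3 {
      bucket => "my-raw-logs"   # hypothetical bucket
      region => "us-east-1"
      codec  => "line"
    }
  }
}
```

The key point is that the clone filter emits both events through the rest of the pipeline, so the conditional on the clone marker is what keeps the S3 copy raw.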


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.