S3 output: Is there a way to have every event in its own file?

I have an unusual requirement to put every event into a separate file in an S3 bucket. I have tried to use this:

rotation_strategy => "size"
size_file => 1

but size_file is only an approximate threshold, so it still puts multiple events into the same file. I have also tried to use this:

tags => "%{[event_unique_id]}"

but that is not evaluated per event; it just puts the same string into every file name.

The only way I could get it to work is to set the prefix, i.e.

prefix => "%{[event_unique_id]}"

but that creates a separate subfolder for each event, and it decreases performance significantly.

Is there a way I can accomplish this?

No, there is not. The multi_receive_encoded method in the output receives a batch of events. It processes them all and only then calls the rotate_if_needed method. As an optimisation, it deliberately calls fstat only once for the entire batch, so rotation cannot happen per event.
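The shape of the problem can be sketched in a few lines of Ruby. This is a self-contained toy model of the batching behaviour described above, not the plugin's actual source (the class and method names here are invented for illustration):

```ruby
# Toy model (NOT the real logstash-output-s3 code) of why size_file => 1
# still groups events: the whole batch is written to the current file
# first, and the size/rotation check runs only once afterwards.
class SketchS3Output
  attr_reader :files

  def initialize(size_file)
    @size_file = size_file
    @files = [[]]            # each inner array stands for one S3 file
  end

  # Mirrors the batch-oriented multi_receive_encoded: write every
  # event in the batch, then check rotation a single time.
  def multi_receive_encoded(batch)
    batch.each { |encoded| @files.last << encoded }
    rotate_if_needed
  end

  private

  # One size check per batch (the real plugin calls fstat once here).
  def rotate_if_needed
    current_size = @files.last.sum(&:bytesize)
    @files << [] if current_size >= @size_file
  end
end

out = SketchS3Output.new(1)           # size_file => 1
out.multi_receive_encoded(%w[a b c])  # one batch of three events
# All three events end up in the first file despite size_file => 1.
```

With a batch size of one, each call would write a single event and then rotate, which is what leads to the workaround below.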

Ok, thanks for the quick response

Come to think of it, if you set pipeline.batch.size to 1 then you might get the result you want. However, tiny batches are less efficient than regular batches (125 events). Not sure how much less.
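For reference, the workaround would look roughly like this (a sketch, not a tested config; the bucket name is assumed, and pipeline.batch.size can go in logstash.yml or per pipeline in pipelines.yml):

```
# logstash.yml (or per-pipeline in pipelines.yml)
pipeline.batch.size: 1    # one event per batch, so one rotation check per event

# pipeline config
output {
  s3 {
    bucket            => "my-bucket"   # assumed name
    rotation_strategy => "size"
    size_file         => 1             # rotate after every (one-event) batch
  }
}
```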

That works @Badger but, as you suggested, performance is really bad as there could be tons of these events. I actually question the validity of this requirement and will go back to the users and try to change it. Thanks so much for giving me the solution!
