Is it possible to split a logstash file output into a random number of output files for throughput reasons?

Hi all.

We are writing to file at about 5000-20000 eps at times, and Logstash handles this fine. The reason we need to write to a file rather than an endpoint is a custom solution (which can't be changed) that reads the file and is bottlenecking.
Has anyone had any luck splitting a file output into multiple files? I.e. if you originally output to a single file called "file", is there an easy way to round-robin across three separate files instead, like file1, file2, file3?

I have not tested it, but you could do something like this:

# pick a random index 0-2 per event, then interpolate it into the filename
filter { ruby { code => 'event.set("[@metadata][file]", rand(3))' } }
output { file { path => "/some/path/file%{[@metadata][file]}" ... } }
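
Since the question asks for round-robin specifically, a strict rotation is also possible by keeping a counter in the ruby filter instead of drawing a random number. A minimal, untested sketch (note that the filter instance is shared across pipeline workers, so the increments are not atomic and the rotation is only approximate under concurrency):

filter {
  ruby {
    # initialize the counter once at pipeline startup
    init => '@i = 0'
    # tag each event with the current index, then rotate 0 -> 1 -> 2 -> 0
    code => 'event.set("[@metadata][file]", @i); @i = (@i + 1) % 3'
  }
}

That said, at 5000-20000 eps a random split should average out to roughly even files anyway, so the simpler random version above is probably sufficient.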
