Create a new file every 60 seconds in Logstash CSV output

I have the following filter configuration in my Logstash pipeline. What it does is: when an event arrives, the ruby filter creates a CSV file with a header row and stores the file name in the event's metadata. The csv output then writes the event fields to that CSV.

The challenge (or requirement) I have is:

  1. Every X seconds, we need to create a new CSV file and write to that file. I am not a Ruby expert and couldn't get any clues from a Google search. Can someone please advise?

  2. Right now this file is being created in the Logstash install folder --> bin folder. Is there any way I can specify a custom location instead?

Appreciate any help!

filter {
    ruby {
        init => "
            require 'csv'
            randval = (0...8).map { (65 + rand(26)).chr }.join
            @csv_file = 'output' + randval + '.csv'
            csv_headers = ['YYYY-MM-ddTHH:mm:ss.SSSZ','Log Level','Event ID']
            if !File.exist?(@csv_file)
                CSV.open(@csv_file, 'w') do |csv|
                    csv << csv_headers
                end
            end
        "
        code => "
            event.set('[@metadata][suffix]', @csv_file)
        "
    }
}
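For reference, the intent of the init block can be run as a standalone Ruby script. The suffix logic, file name pattern, and headers are taken from the config; wrapping it in a method and the directory parameter are just for illustration:

```ruby
require 'csv'

# Create the CSV with a header row if it does not already exist,
# mirroring the ruby filter's init block. The random 8-letter suffix
# keeps file names from colliding across pipeline restarts.
def ensure_csv(dir = '.')
  randval = (0...8).map { (65 + rand(26)).chr }.join
  csv_file = File.join(dir, 'output' + randval + '.csv')
  csv_headers = ['YYYY-MM-ddTHH:mm:ss.SSSZ', 'Log Level', 'Event ID']
  unless File.exist?(csv_file)
    CSV.open(csv_file, 'w') { |csv| csv << csv_headers }
  end
  csv_file
end
```

Note that because the suffix is computed once in init, every event in the pipeline's lifetime goes to the same file.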

output {
    file {
        path => "output.log"
    }
    csv {
        fields => [ "created", "level", "code" ]
        path => "%{[@metadata][suffix]}"
    }
}

You can specify the path when you open the file.
Use the epoch time divided by the number of seconds to keep a file, instead of a random value:

path = '/path/I/want/the/file/'
id = Time.now.to_i / seconds_to_change
@csv_file = path + 'output' + id.to_s + '.csv'
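To make the epoch-bucket idea concrete: dividing the epoch time by the interval yields an integer quotient that stays constant within the interval and increments at each boundary, so the derived file name changes exactly once per interval. A minimal sketch (the directory, the 60-second interval, and the helper name are example values, not from the thread):

```ruby
# Derive a file name from the current epoch time divided by the
# rotation interval: the integer quotient (and so the name) only
# changes once every 'seconds_to_change' seconds.
def bucketed_name(path, seconds_to_change, now = Time.now.to_i)
  id = now / seconds_to_change
  path + 'output' + id.to_s + '.csv'
end

# Example: rotate every 60 seconds into /tmp (example values).
# bucketed_name('/tmp/', 60)
```

In the ruby filter, the computation would move from init into the code block so it is re-evaluated per event rather than once at startup.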

Thank you! Is there any way to create a new file only every, say, 30 seconds? Right now it is creating a new file as soon as it receives events from the input (winlogbeat).

I tried to configure the winlogbeat "queue" settings to send data only every 30 seconds, but that does not seem to work as expected (I see data arriving roughly every second).
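For what it's worth, the Beats memory queue can be tuned to flush less often; a sketch of the relevant section of winlogbeat.yml, with example values (actual behavior depends on the Beats version and on how quickly the queue fills):

```
queue.mem:
  events: 4096
  flush.min_events: 512
  flush.timeout: 30s
```

Note the queue still flushes early once flush.min_events is reached, so a busy channel can ship batches well before the timeout expires.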

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.