I have the following filter configuration in my Logstash pipeline. The ruby filter's init block creates a CSV file with a header row and stores the file name in @csv_file; the code block then copies that name into event metadata, and the csv output writes rows to that file.
I have two challenges/requirements:
-
Every X seconds, we need to create a new CSV file and write to that file. I am not a Ruby expert and couldn't find any clues through Google searches. Can someone please advise?
-
Right now the file is created inside the Logstash installation folder, under bin. Is there a way to specify a custom location instead?
Appreciate any help!
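For the custom location, one idea I considered (untested sketch; /var/log/myapp is only an assumed example directory): build @csv_file as an absolute path in the init block, since the csv output's path setting just expands whatever is stored in the metadata field:

```
ruby {
  init => "
    require 'fileutils'
    dir = '/var/log/myapp'            # assumed custom directory
    FileUtils.mkdir_p(dir)            # create it if it does not exist yet
    randval = (0...8).map { (65 + rand(26)).chr }.join
    @csv_file = File.join(dir, 'output' + randval + '.csv')
  "
}
```

This assumes the user Logstash runs as has write permission on that directory.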
filter {
  ruby {
    init => "
      begin
        randval = (0...8).map { (65 + rand(26)).chr }.join
        @csv_file = 'output' + randval + '.csv'
        csv_headers = ['YYYY-MM-ddTHH:mm:ss.SSSZ','Log Level','Event ID']
        if File.zero?(@csv_file) || !File.exist?(@csv_file)
          CSV.open(@csv_file, 'w') do |csv|
            csv << csv_headers
          end
        end
      end
    "
    code => '
      event.set("[@metadata][suffix]", @csv_file)
    '
  }
}
output {
  file {
    path => "output.log"
  }
  csv {
    fields => ["created", "level", "code"]
    path => "%{[@metadata][suffix]}"
  }
}
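For the every-X-seconds part, the closest I could come up with is this untested sketch: instead of fixing the file name once in init, compute it per event in the code block by flooring the current time down to the start of an X-second bucket, so the name (and therefore the file the csv output writes to) changes whenever the bucket rolls over. The 30-second interval and the /var/log/myapp directory are my own assumptions:

```
filter {
  ruby {
    init => "
      require 'csv'
      require 'fileutils'
      @rotate_secs = 30                 # X: assumed rotation interval
      @dir = '/var/log/myapp'           # assumed custom directory
      @headers = ['YYYY-MM-ddTHH:mm:ss.SSSZ','Log Level','Event ID']
      FileUtils.mkdir_p(@dir)
    "
    code => "
      # Floor 'now' to the start of the current X-second bucket; every event
      # in the same bucket gets the same file name.
      bucket = (Time.now.to_i / @rotate_secs) * @rotate_secs
      file = File.join(@dir, 'output' + bucket.to_s + '.csv')
      # Write the header row once, when the bucket's file first appears.
      if !File.exist?(file) || File.zero?(file)
        CSV.open(file, 'w') { |csv| csv << @headers }
      end
      event.set('[@metadata][suffix]', file)
    "
  }
}
```

If I read the docs right, the csv output (built on the file output) releases idle file handles via its stale_cleanup_interval setting, so older bucket files should get closed on their own, but I haven't verified that.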