I have the following pipeline (an example, not the full pipeline) where I need to count the number of events for EVERY file I write as output. For example, this pipeline creates a new file every 30 seconds containing the events processed in those 30 seconds. The code sort of works, except for two issues:
- Each trap_id is written as its own entry in the output file, but I want just ONE entry with the final count. Although I am only counting all events here, my end goal is to count events per log_level, for example how many "Error", "Information", etc. Windows events there are (see the sketch after this list).
- When I reset the trap_id counter, the first entry in the output file starts at ZERO instead of ONE.
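To make the end goal concrete, here is the rough shape of the per-level counting I have in mind. This is only a sketch, and "log_level" is an assumed field name on my events:

ruby {
  init => '
    # sketch only: running counters keyed by the (assumed) log_level field
    @level_counts = Hash.new(0)
  '
  code => '
    level = event.get("log_level")
    @level_counts[level] += 1
    event.set("level_count", @level_counts[level])
  '
}

What I cannot work out is how to emit only the final counts once per 30-second window, instead of one line per event as the pipeline below does.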
Can someone please advise how I can address these issues?
I have posted the same question on SO too.
input {
  beats {
    port => 5045
  }
}

filter {
  ruby {
    init => '
      @trap_id = 0
      @lasttimestmp = 0
    '
    code => '
      evnttme = event.get("[@metadata][ms]")
      if @lasttimestmp == evnttme
        # same timestamp as the previous event: keep counting
        @trap_id += 1
        event.set("lsttimestmp", @lasttimestmp)
        event.set("trap_id", @trap_id)
      else
        # new timestamp: reset the counter (issue 2: the first entry is 0 here)
        @trap_id = 0
        @lasttimestmp = evnttme
        event.set("lsttimestmp", evnttme)
        event.set("[@metadata][ms]", evnttme)
        event.set("trap_id", @trap_id)
      end
    '
  }
}

output {
  file {
    path => "output.log"
  }
  file {
    flush_interval => 30
    codec => line { format => "%{[@metadata][ts]}, %{[trap_id]}" }
    # issue 1: every event is written here, one line per trap_id
    path => "C:/lgstshop/local/csv/output%{[@metadata][ms]}.csv"
  }
}
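To illustrate both issues, a 30-second window with three events currently produces one CSV line per event, something like this (the timestamp value is made up):

1656425400000, 0
1656425400000, 1
1656425400000, 2

What I want instead is a single line per file with the final count, e.g.:

1656425400000, 3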