Hi Team,
I have the CSV data below, which was exported from Elasticsearch (ES):
when,module,fa,ui,at,ci,cn,link
"April 23rd 2019, 09:13:31.562",SCM,MR,cgrant1,SCM.MR.LOAD_FILTERS,TALSCMK,TALSCMK,1
"April 23rd 2019, 22:38:31.868",SCM,TS,101031,SCM.TS.LIST_SAVED_SEARCH,CQAAbby,CQAAbby,1
"April 23rd 2019, 22:38:33.700",SCM,TS,101031,SCM.TS.START_OVER,CQAAbby,CQAAbby,1
"April 23rd 2019, 22:38:42.015",SCM,TS,101031,SCM.TS.SEARCH,CQAAbby,CQAAbby,1
"April 24th 2019, 01:02:42.097",SCM,TS,100253,SCM.TS.COMPARE,CQAAbby,CQAAbby,1
"April 24th 2019, 01:03:01.850",SCM,TS,100253,SCM.TS.SEARCH,CQAAbby,CQAAbby,1
"April 24th 2019, 01:03:09.915",SCM,TS,100253,SCM.TS.COMPARE,CQAAbby,CQAAbby,1
"April 24th 2019, 01:03:34.538",SCM,TS,100253,SCM.TS.SEARCH,CQAAbby,CQAAbby,1
...
...
and I want to get a sum grouped by the distinct "at" values. The expected output should be:
at,sum
SCM.MR.LOAD_FILTERS,10
SCM.TS.LIST_SAVED_SEARCH,100
SCM.TS.START_OVER,2
SCM.TS.SEARCH,120
...
...
This is my Logstash configuration file, but the file "funneldata.csv" created by the csv output plugin stays empty:
filter {
  csv {
    columns => ["when", "module", "fa", "ui", "at", "ci", "cn", "link"]
    separator => ","
    skip_header => "true"
  }
  aggregate {
    task_id => "%{at}"
    code => "
      map['action'] ||= ''
      map['action'] = event.get('at')
      map['sum'] ||= 0
      map['sum'] += 1
    "
    push_map_as_event_on_timeout => true
    timeout_task_id_field => "at"
    inactivity_timeout => 30
    timeout_code => "
      event.set('action', event.get('at'))
      event.set('sum', event.get('sum'))
    "
  }
}

output {
  csv {
    fields => ["action", "sum"]
    path => "C:/elkstack/elasticsearch-6.5.1/logs/funneldata.csv"
    csv_options => { "headers" => true }
  }
  stdout { codec => rubydebug }
}
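To check whether I understand the aggregate filter correctly, here is my mental model of what the "code" and "timeout_code" blocks should do, written as a plain-Python sketch (the function names are mine, not the plugin's):

```python
# Mental model of the aggregate filter above: one map is kept per distinct
# task_id ("%{at}"), and on inactivity timeout that map is pushed as a
# new event with the task id restored via timeout_task_id_field.
maps = {}

def on_event(event):
    # Equivalent of the "code" block: update the map for this "at" value.
    m = maps.setdefault(event["at"], {"action": "", "sum": 0})
    m["action"] = event["at"]
    m["sum"] += 1

def on_timeout(at):
    # Equivalent of push_map_as_event_on_timeout plus timeout_code:
    # the map itself becomes the new event, with "at" added back.
    event = maps.pop(at)
    event["at"] = at
    return event
```

If that model is right, I would expect one event per "at" value with the accumulated sum, but the output file is still empty.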
How can I correct the filter so that it produces the expected output file?