Hi,
I have a JSON log file that I need to parse, like the line below:
{"DP":"dbpool3","CMID":"TALSCMCherie","CMN":"Siemens","UID":"cgrant1","UN":"cgrant","PUID":"lokamoto1","UL":"en_GB","CIP":"192.168.56.1","SN":"qacandrot_TALSCMCherie","DC":"DC04","CLN":"TalentSearchController","MID":"SCM","PID":"TalentSearch","PQ":"v11","AC":"TalentSearch","SCM.TS.TS.IIN":"true","SCM.TS.TS.MACO":"false","SCM.TS.TS.COND":["KC","BC","PC","FC","RC"],"SCM.TS.TS.BC":["age","fax","ethnicity"],"SCM.TS.TS.PC":["achievements","languages"],"SCM.TS.TS.FC":["department","location"],"SCM.TS.TS.RC":["sysOverallPotential","sysOverallCustom1"],"SCM.TS.TS.NR":150}
My expected output, displayed in Kibana after the Logstash filter parses it, is:
"CMID":"TALSCMCherie",
"CIP":"192.168.56.1",
"SN":"qacandrot_TALSCMCherie",
"PID":"TalentSearch",
"AC":"TalentSearch",
"SCM.TS.TS.COND": "KC","BC","PC","FC","RC",
"SCM.TS.TS.BC": "age","fax","ethnicity",
"SCM.TS.TS.PC": "achievements","languages",
"SCM.TS.TS.FC":"department","location",
"SCM.TS.TS.RC":"sysOverallPotential","sysOverallCustom1",
"SCM.TS.TS.NR":150
I am using a ruby filter to parse the JSON; here is my logstash.conf:
input {
  file {
    path => "C:\elkstack\elasticsearch-6.5.1\logs\kv.log"
    start_position => "beginning"
    sincedb_path => "null"
  }
}
filter {
  ruby {
    code => "
      event.get('message').each { |kv|
        if kv['value'].is_a?(Array)
          event.set(kv['key'], kv['value'].join(', '))
        else
          event.set(kv['key'], kv['value'])
        end
      }"
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "logstash-talentauditkv"
  }
  stdout {
    codec => rubydebug
  }
}
But in Kibana, the message field is not parsed at all. Can anyone take a look at my ruby filter and the ruby code and tell me where the problem is? By the way, the json filter can parse it correctly.
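For reference, this is roughly the json filter setup I tested instead of the ruby block (same pipeline, only the filter section swapped), and it does produce the individual fields in Kibana:

filter {
  json {
    # parse the raw JSON line held in the message field
    source => "message"
  }
}

So the log data itself seems fine; it is only my ruby filter that leaves the message field untouched.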