Logstash ruby filter doesn't parse JSON and returns the whole message field in Kibana

Hi ,

I have a JSON log file that I need to parse, like the line below:
{"DP":"dbpool3","CMID":"TALSCMCherie","CMN":"Siemens","UID":"cgrant1","UN":"cgrant","PUID":"lokamoto1","UL":"en_GB","CIP":"192.168.56.1","SN":"qacandrot_TALSCMCherie","DC":"DC04","CLN":"TalentSearchController","MID":"SCM","PID":"TalentSearch","PQ":"v11","AC":"TalentSearch","SCM.TS.TS.IIN":"true","SCM.TS.TS.MACO":"false","SCM.TS.TS.COND":["KC","BC","PC","FC","RC"],"SCM.TS.TS.BC":["age","fax","ethnicity"],"SCM.TS.TS.PC":["achievements","languages"],"SCM.TS.TS.FC":["department","location"],"SCM.TS.TS.RC":["sysOverallPotential","sysOverallCustom1"],"SCM.TS.TS.NR":150}

My expected output in Kibana after the Logstash filter parses it is:

"CMID":"TALSCMCherie",
"CIP":"192.168.56.1",
"SN":"qacandrot_TALSCMCherie",
"PID":"TalentSearch",
"AC":"TalentSearch",
"SCM.TS.TS.COND": "KC","BC","PC","FC","RC",
"SCM.TS.TS.BC": "age","fax","ethnicity",
"SCM.TS.TS.PC": "achievements","languages",
"SCM.TS.TS.FC":"department","location",
"SCM.TS.TS.RC":"sysOverallPotential","sysOverallCustom1",
"SCM.TS.TS.NR":150

I am using the ruby filter to parse the JSON. Here is my logstash.conf:
input {
  file {
    path => "C:\elkstack\elasticsearch-6.5.1\logs\kv.log"
    start_position => "beginning"
    sincedb_path => "null"
  }
}

filter {
  ruby {
    code => "
      event.get('message').each { |kv|
        if kv['value'].is_a?(Array)
          event.set(kv['key'], kv['value'].join(', '))
        else
          event.set(kv['key'], kv['value'])
        end
      }
    "
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "logstash-talentauditkv"
  }

  stdout {
    codec => rubydebug
  }
}

But in Kibana, the message field is not parsed at all. Can anyone take a look at my ruby filter and ruby code and tell me where the problem is? By the way, the json filter can parse it correctly.
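(For reference: at the point the ruby filter runs, `event.get('message')` returns the raw log line as a plain String, not a parsed hash, so there are no key/value pairs to iterate and the `kv['value']` lookups never happen as intended. A minimal standalone Ruby sketch, outside Logstash and using the stdlib `json` library, of what the filter logic would need to do first:)

```ruby
require 'json'

# Abbreviated sample of the raw line as it sits in the "message" field.
message = '{"CMID":"TALSCMCherie","SCM.TS.TS.COND":["KC","BC","PC"],"SCM.TS.TS.NR":150}'

# A String has no key/value pairs; it must be parsed into a Hash first.
# Then each value can be flattened the way the filter intends.
fields = {}
JSON.parse(message).each do |key, value|
  fields[key] = value.is_a?(Array) ? value.join(', ') : value
end

fields.each { |k, v| puts "#{k} => #{v}" }
```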

So, you basically just want to drop some fields? As the input is already JSON, I would read it in as JSON and just drop the unwanted fields in the filter section.

So set the codec to json, per https://www.elastic.co/guide/en/logstash/6.5/configuration-file-structure.html#codec

The config would look something like this:

input {
  file {
    path => "C:\elkstack\elasticsearch-6.5.1\logs\kv.log"
    start_position => "beginning"
    sincedb_path => "null"
    codec => "json"
  }

}

filter {

    mutate {
      remove_field => [ "[foo]" ]
    }
}
...

Oh, you want to turn the arrays into strings as well... That will need something else...

Maybe this will help
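(One way to combine the two ideas, sketched here untested against 6.5: keep `codec => "json"` on the input so the line is already parsed into event fields, then use a ruby filter only to flatten the array-valued fields into comma-separated strings. `event.to_hash` and `event.set` are the standard Logstash ruby filter event API.)

```
filter {
  ruby {
    code => "
      event.to_hash.each do |key, value|
        event.set(key, value.join(', ')) if value.is_a?(Array)
      end
    "
  }
}
```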

Hi A_B,

Thanks for the reply. The json filter does work, but I am trying to find a solution with the ruby filter. Do you have any advice on why my ruby code doesn't work?

Thanks,
Cherie

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.