Split JSON fields of similar type in Logstash

I have a JSON file converted from a pcap using tshark. There are different layers inside the JSON and each layer has different fields. Some fields repeat inside the same message, for example "bicc_bicc_cic" as shown below.
How can I separate them into unique fields so that I can search for the values in Discover?

"layers":{
"bicc":[{
		"bicc_bicc_cic":"22240",
		"bicc_bicc_cic":"22763",
		"bicc_bicc_cic":"90"

I tried the ruby filter below, but it is not working:

ruby {
  code => '
    val_a = []
    event.get("[layers][bicc][bicc_bicc_cic]").each { |k, v|
      v["temp"] = k
      val_a << v
    }
    event.set("val_a", val_a)
  '
}
if [val_a] {
  split {
    field => "val_a"
  }
}

That should be event.get("[layers][bicc][0]").each. However, I do not think this will work.
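
For reference, a sketch of the filter with that path change (the "field"/"value" key names are just illustrative, not anything the original used); as explained below, it still cannot recover the duplicate keys:

ruby {
  code => '
    # [layers][bicc] is an array, so index into it before iterating the hash
    val_a = []
    event.get("[layers][bicc][0]").each { |k, v|
      # the original code assumed v is a hash, but here each value is a string,
      # so collect the key/value pairs into new hashes instead
      val_a << { "field" => k, "value" => v }
    }
    event.set("val_a", val_a)
  '
}
if [val_a] {
  split {
    field => "val_a"
  }
}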

JSON allows duplicate keys, although it is recommended not to do that. In both Java and Ruby the JSON will be parsed into a hash, and the duplicate keys will overwrite one another in the json codec/filter before the event ever reaches the ruby filter.
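
A quick way to see that in plain Ruby (the document string below just mirrors the duplicated keys from the sample above):

require 'json'

# tshark emits the repeated field as duplicate keys in one object
doc = '{"bicc_bicc_cic":"22240","bicc_bicc_cic":"22763","bicc_bicc_cic":"90"}'

# JSON objects are parsed into a Hash, so later duplicates overwrite earlier ones
puts JSON.parse(doc).inspect   # => {"bicc_bicc_cic"=>"90"} (only the last value survives)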
