How to parse JSON data in a ruby filter?

Hi,

I have a JSON file and I am trying to parse its fields in Logstash, so I decided to write a ruby filter to parse those JSON fields. Is there any way to do this?

for example,
The JSON file looks like this:

{
  "team": {
    "team1": {
      "team_name" : "xxx",
      "team_count" : "10"
    },
    "team2": {
      "team_name" : "yyy",
      "team_count" : "10"
    },
    "team_n": {
      ....
    }
  }
}

And I have to parse the fields of that JSON file in a dynamic way, which means I want to write only one Logstash configuration file that works in all cases: if I add anything to the JSON file, the Logstash configuration should still work with it.
Thanks.

It's not clear why you would have to use a ruby filter for this.

If the JSON file really looks like that (i.e. the JSON object is spread over multiple lines) then that's going to be the first hurdle to overcome.
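If it does, one common approach is to stitch the lines back together with a multiline codec on the input and parse the result afterwards. A rough sketch, assuming the file is pretty-printed so that only the top-level braces start at the beginning of a line (the path and pattern here are placeholders, not taken from your setup):

input {
  file {
    path => "/path/to/teams.json"      # hypothetical path
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "^\{"                 # a new object starts at an unindented "{"
      negate => true
      what => "previous"               # every other line belongs to the previous event
      auto_flush_interval => 2         # flush the last object even if nothing follows it
    }
  }
}

filter {
  json {
    source => "message"                # parse the reassembled JSON object
  }
}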

What does your configuration look like now? What does an example event produced by a stdout { codec => rubydebug } output look like?

If I am using the kv filter, I have to mention all the values statically, but I don't want to do this because my input values are taken from the JSON file, and this file contains dynamic values. If I change or add anything in the JSON file, I don't want to have to change the config file as well.

If I am using the kv filter, I have to mention all the values statically.

No, that's not true. Perhaps you're thinking of the csv filter.
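The kv filter picks up key names from the data itself, so nothing has to be listed statically. As a rough illustration (the separators and the sample message below are made up, not taken from your data):

filter {
  kv {
    source => "message"
    field_split => ","      # split key/value pairs on commas
    value_split => "="      # split keys from values on "="
  }
}

A message like processid=123,date=2017-05-01 would then come out with a processid field and a date field without either name appearing in the configuration.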

but I don't want to do this because my input values are taken from the JSON file, and this file contains dynamic values. If I change or add anything in the JSON file, I don't want to have to change the config file as well.

Use a json or json_lines codec or a json filter to parse JSON data.
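For example, a minimal filter along these lines (assuming the JSON document ends up in the message field):

filter {
  json {
    source => "message"
    # target => "parsed"   # optional: nest the parsed fields under one key instead of the event root
  }
}

If the input already delivers one JSON object per line, a codec => "json" (or json_lines for streaming inputs) on the input does the same job without a separate filter.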

I am not clear on these methods. I have a separate JSON file and I am getting all the values from that file. I have to parse that JSON only when the 'message' contains some specific string, and I have to parse only those kinds of JSON values.
For example, the JSON file:
{
  "index_name" : "sample_index-%{+YYYY.MM.dd}",
  "host" : "localhost",
  "fields" : "example-query, testing-process",
  "example-query" : {
    "keywords" : ["processid", "date"],
    "f_split" : ", ",
    "v_split" : "="
  },
  "testing-process" : {
    "isSearchKey" : "false",
    "keywords" : ["processvalue", "activitydate"],
    "f_split" : ";",
    "v_split" : ":"
  }
}

From this JSON example: if my message field corresponds to one of the values in "fields" : "example-query, testing-process", it should pick up the respective section from the JSON file, and I have to parse the message using the values from that section.

You'll always parse the whole JSON string, but if you afterwards don't want to keep all fields then they can be cleaned up.
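For instance, something like this after the JSON has been parsed (the field names to drop are only examples):

filter {
  mutate {
    # drop the parts of the parsed document you don't want to keep
    remove_field => ["host", "index_name"]
  }
}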

Please provide the configuration and example event I asked for earlier.

This is my Logstash config file:

input {
  file {
    path => "/logs/*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  mutate {
    add_field => {
      "fields_sep" => "{{ fields }}"
    }
  }
  ruby {
    code => "
      # split the comma-separated list of field names and copy each one into its own event field
      find_value = event['fields_sep'].split(',')
      find_value.each_index { |i| event[find_value[i]] = find_value[i] }
    "
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "{{ index_name}}"
  }
  stdout {
    codec => rubydebug
  }
}

JSON file:

{
  "index_name" : "sample_index-%{+YYYY.MM.dd}",
  "host" : "localhost",
  "fields" : "example-query, testing-process",
  "example-query" : {
    "keywords" : ["processid", "date"],
    "f_split" : ", ",
    "v_split" : "="
  },
  "testing-process" : {
    "isSearchKey" : "false",
    "keywords" : ["processvalue", "activitydate"],
    "f_split" : ";",
    "v_split" : ":"
  }
}

I want to add fields based on the particular JSON sections that match the message line.

Last chance: What does an example event produced by a stdout { codec => rubydebug } output look like?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.