JSON array splitting in Logstash

Hi,

We have the following kind of logs in JSON files.

{"info":{"tstmp":1.2,"from":"avshdd","hostid":"jahgcjha","log":{"version":"jhasgc","id":"jsadh","jobname":"xzcj","lognm":{"class":"jscjks","msg":[{"users":"kjdfk","commands":"jhscjhs","pririty":"jhasj","host":"kjsjcksjd"},{"users":"kjdfkxc","commands":"jhscjhsdsf","pririty":"jhasjdd","host":"kjsdfsjcksjd"}],"severity":"info"}},"mtype":"kjsjs"}}

We tried splitting the array in it, but no luck. We are new to parsing JSON. The following is the Logstash config we are using:

input {
  file {
    path => ["/var/log/validjson3.log"]
    type => "json"
    codec => "json"
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}

filter {
  split { field => "[info][log][lognm]lmsg" }
}

output {
  elasticsearch {
    hosts => ["xyz:9200"]
    sniffing => false
    manage_template => false
    user => "user"
    password => "passwd"
    index => "json-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}

While using this config we got the following error:

Please let us know if we are missing something in the Logstash config.

Regards,
Shweta

split { field => "[info][log][lognm]lmsg" }

"lmsg"? Try this:

 split { field => "[info][log][lognm][msg]" }
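
For reference, a sketch of how the corrected filter section would look in the config posted above (only the field path differs from the original); the comment describes the split filter's behaviour of emitting one event per array element.

filter {
  # split clones the event once per element of the referenced array and
  # replaces the array with the single element in each clone, so the
  # example event at the top of the thread becomes two events, one per
  # entry in [info][log][lognm][msg].
  split {
    field => "[info][log][lognm][msg]"
  }
}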

Thank you, Magnus. That was a typing mistake.

We have the following nested array structure:

{
  "info": {
    "tstmp": 1.2,
    "from": "avshdd",
    "hostid": "jahgcjha",
    "log": {
      "version": "jhasgc",
      "id": "jsadh",
      "jobname": "xzcj",
      "lognm": {
        "msg": {
          "groups": [{
            "name": "asg",
            "gid": 0
          }],
          "user": [{
            "users": "kjdfk",
            "commands": "jhscjhs",
            "pririty": "jhasj",
            "host": "kjsjcksjd"
          }, {
            "users": "kjdfkxc",
            "commands": "jhscjhsdsf",
            "pririty": "jhasjdd",
            "host": "kjsdfsjcksjd"
          }]
        },
        "severity": "info"
      }
    },
    "mtype": "kjsjs"
  }
}

There are two nested arrays in "msg", named "groups" and "user".

Please help us with a filter for it.

Thanks in advance.

Regards,
Shweta

Please show the wanted outcome for the given example event.

@magnusbaeck: We want the fields to appear in Kibana like the following:

info.log.msg.groups.name
info.log.msg.groups.gid

info.log.msg.user.users
info.log.msg.user.commands
info.log.msg.user.pririty
info.log.msg.user.host

Thanks and Regards,
Shweta

Please be more explicit. What do the wanted event JSON object(s) look like?

@magnusbaeck

We have provided the following filter configuration:

filter {
  split { field => "[info][log][lognm][msg][groups]" }
}

which gives us the following result in Kibana:

We want the user and groups arrays to be parsed at the same level, but we are not able to come up with a filter configuration for same-level arrays in JSON.

Thanks and regards,
Shweta
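
One possible approach for two arrays at the same level, sketched here as an assumption rather than anything confirmed in this thread, is to chain two split filters; the second split runs on every event produced by the first, giving one event per groups/user combination.

filter {
  # One event per element of the groups array.
  split { field => "[info][log][lognm][msg][groups]" }
  # Each of those events is split again, one per element of the user
  # array, i.e. a groups x user cross product.
  split { field => "[info][log][lognm][msg][user]" }
}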

You are not answering my question. I'm not asking what you currently have. I'm asking what you want. Last chance: What do the wanted event JSON object(s) look like?

@magnusbaeck I have the same type of problem. Can somebody please help me with this? I am stuck on it. Please help.

Thanks
Nitin Bhaisare

Sorry, @magnusbaeck. I want to have fields like this:

info.log.lognm.msg.user.users: kjdfk
info.log.lognm.msg.user.commands: jhscjhs
info.log.lognm.msg.user.pririty: jhasj
info.log.lognm.msg.user.host: kjsjcksjd

But with the current configuration we are getting this in Kibana:

info.log.lognm.msg.user: [
  {
    "users": "kjdfk",
    "commands": "jhscjhs",
    "pririty": "jhasj",
    "host": "kjsjcksjd"
  },
  {
    "users": "kjdfkxc",
    "commands": "jhscjhsdsf",
    "pririty": "jhasjdd",
    "host": "kjsdfsjcksjd"
  }
]

Hope this is the thing you asked for.

I wanted to have a flattened event.

We got the following in logstash.stdout:

{
    "info" => {
        "tstmp" => 1.2,
        "from" => "avshdd",
        "hostid" => "jahgcjha",
        "log" => {
            "version" => "jhasgc",
            "id" => "jsadh",
            "jobname" => "xzcj",
            "lognm" => {
                "msg" => {
                    "groups" => {
                        "name" => "gasg",
                        "gid" => 0
                    },
                    "user" => [
                        [0] {
                            "users" => "kjdfk",
                            "commands" => "jhscjhs",
                            "pririty" => "jhasj",
                            "host" => "kjsjcksjd"
                        },
                        [1] {
                            "users" => "kjdfkxc",
                            "commands" => "jhscjhsdsf",
                            "pririty" => "jhasjdd",
                            "host" => "kjsdfsjcksjd"
                        }
                    ]
                },
                "severity" => "info"
            }
        },
        "mtype" => "kjsjs"
    },
    "@version" => "1",
    "@timestamp" => "2017-04-13T07:36:54.912Z",
    "path" => "/var/log/123.log",
    "host" => "test"
}

Thanks and regards,
Shweta
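
Judging by the stdout output above, groups is already a single object at this point while user is still an array, so a sketch of the remaining step (an assumption, not something confirmed in the thread) would be one more split on the user array; each resulting event then carries a single user object, which Kibana shows as the flattened fields info.log.lognm.msg.user.users, info.log.lognm.msg.user.commands, and so on.

filter {
  # One event per element of the user array; in each event
  # [info][log][lognm][msg][user] holds a single object instead of an array.
  split { field => "[info][log][lognm][msg][user]" }
}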
