Trying to parse the content of a JSON message into different fields

Hello, I am quite new to Elasticsearch/Logstash/Kibana, but I am trying to build a small dashboard with some logs I am getting from different machines.
The logs consist of statements like this:

INFO - TEXT - Sat Aug 18 17:53:45 CEST 2018

{
  "wsName": "newFile",
  "Connection type": "WIFI: "WifiName"",
  "RAM available": "32.78%",
  "CPU usage": "19.72%",
  "Internal storage available (MB)": 99999.99,
  "External storage available (MB)": 99999.99,
  "Connectivity": "Is available: true. Is connected: true. Type connectivity: WIFI. Wifi signal level: 4 out of 5"
}

I've used multiline in Filebeat and I am creating blocks of information for "INFO" and "ERROR", which are the two types of entries I can have at the moment. That part works more or less fine.
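For reference, the multiline setup is along these lines (paths and details simplified, so treat it as illustrative rather than my exact file):

filebeat.inputs:
- type: log
  paths:
    - /var/log/myapp/*.log        # illustrative path
  # start a new event on every line that begins with INFO or ERROR;
  # all following lines are appended to that event
  multiline.pattern: '^(INFO|ERROR)'
  multiline.negate: true
  multiline.match: after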
But my problems start when I get to the grok part. First of all, I realised I needed a custom pattern for the timestamp, so I came up with this:

TIMEZ_CUSTOM (?:[PMCE][SD]T|UTC|CEST|CET) ?(?:[+-](?:[0-5]\d)?)?
DATESTAMP_CUSTOM %{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{TIMEZ_CUSTOM} %{YEAR}
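These live in a file under the grok patterns directory, and can then be referenced like any built-in pattern, e.g. (the header part of this pattern is just a sketch):

grok {
  patterns_dir => ["/etc/logstash/conf.d/patterns/"]
  match => { "message" => "%{LOGLEVEL:loglevel} - %{DATA:concepto} - %{DATESTAMP_CUSTOM:timestamp}" }
}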

This works more or less fine (I added the custom patterns to the patterns file), but my problems start when I try to parse the info inside the {}. For each message, I need to retrieve some info from the fields inside the JSON:
- From "Connection type", I need to retrieve the WifiName and create a new field "WifiName".
- From "Connectivity", I need to retrieve the Wifi signal level, for example "4 out of 5", which means creating a new field "Signal Level" or something like that.
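To illustrate with the sample above, I would want to end up with fields roughly like:

WifiName     => "WifiName"
Signal Level => "4 out of 5"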

And I am quite lost with this parsing. Should I use grok, or is there some other way to achieve this?

Any help on how I can parse this? Every little bit of help is welcome!
Thanks!

In these cases it's helpful to layer filters. Is the JSON blob the last part of the message? If so, why not capture the whole thing into a single field, and then use the JSON filter to parse the result?

filter {
  grok {
    # ... capture the _whole_ JSON blob; store it in `[@metadata][json_payload]`
  }
  json {
    source => "[@metadata][json_payload]"
  }
}
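For example, here is a rough (untested) sketch, assuming the JSON blob always follows the header line; the header pattern and field names are illustrative:

filter {
  grok {
    # (?m) lets the pattern span newlines; everything after the header line
    # (i.e. the JSON blob) is stored under [@metadata][json_payload]
    match => { "message" => "(?m)^%{LOGLEVEL:loglevel} - %{DATA:concepto} - (?<timestamp>[^\n]+)\n\s*%{GREEDYDATA:[@metadata][json_payload]}" }
  }
  json {
    source => "[@metadata][json_payload]"
  }
}

Once the json filter has populated the event, you can layer a second grok over the resulting fields to pull out the bits you need, something like this (assuming the JSON keys land as top-level fields with these names):

grok {
  # try both patterns instead of stopping at the first match
  break_on_match => false
  match => {
    "Connection type" => 'WIFI: "%{DATA:WifiName}"'
    "Connectivity"    => "Wifi signal level: %{GREEDYDATA:SignalLevel}"
  }
}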

Thank you for the answer, I think this might be the solution to my problem.

But could I somehow split up the info from inside the JSON? Also, not all the information blocks contain JSON; can I apply the json filter only inside an if condition? I tried, but I didn't get any result.
Thanks!

What specifically did you try? How did it behave differently than you expected? Including example pipeline configurations and multiple example input events is the best way to give sufficient context for us to be of help :blush:

So, I tried something like this:

filter {
  mutate {
    gsub => ["message", "\n\r", "\n"]
  }
  grok {
    patterns_dir => ["/etc/logstash/conf.d/patterns/"]
    match => { "message" => "%{LOGLEVEL:loglevel} - %{GREEDYDATA:concepto} - %{DATESTAMP_CUSTOM:timestamp}\n(?<aplic_msg>[^\n]*)\n(?<data>(.|\r|\n)*)" }
  }
  mutate {
    gsub => ["timestamp", "CEST", "Etc/GMT-2"]
  }
  date {
    match => [ "timestamp", "EEE MMM dd HH:mm:ss Z yyyy" ]
  }
  json {
    source => "data"
  }
}

But I am not getting any result besides the "data" field. What am I doing wrong?
Also, I am facing another problem: some blocks contain JSON info but some don't.

Can I specify multiple grok patterns in one filter?

Thanks for all the help!! :smiley:

I finally made it work! Now I am fighting with the timestamp.
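In case it helps anyone else, the JSON part of my config now looks roughly like this (simplified; I guard the json filter with a conditional so events without a JSON block skip it):

if [data] =~ /^\{/ {
  # assumes [data] starts with the JSON blob when one is present
  json {
    source => "data"
  }
}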

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.