Hello, I am quite new to Elasticsearch/Logstash/Kibana, but I am trying to build a small dashboard with some logs I am getting from different machines.
The logs are made up of statements like this:
INFO - TEXT - Sat Aug 18 17:53:45 CEST 2018
{
  "wsName": "newFile",
  "Connection type": "WIFI: "WifiName"",
  "RAM available": "32.78%",
  "CPU usage": "19.72%",
  "Internal storage available (MB)": 99999.99,
  "External storage available (MB)": 99999.99,
  "Connectipvity": "Is available: true. Is connected: true. Type connectivity: WIFI. Wifi signal level: 4 out of 5"
}
I've used the multiline option in Filebeat to create blocks of information for "INFO" and "ERROR", which are the two types of entries I can have at the moment. That part goes more or less fine.
But my problems start when I get to the grok part. First of all, I realised I needed a custom pattern for the timestamp, so I found this:
That works fine (I added the custom patterns to the file), but my problems start when I try to parse the info inside the {}. I need to retrieve some info per message from the fields inside the JSON:
-From the "Connection type", I need to retrieve the WifiName and create a new field "WifiName".
-From the "Connectivity", I need to retrieve the Wifi Signal Level, for example "4 out of 5", which means creating a new field "Signal Level" or something like that.
And I am quite lost with this parsing. Should I use grok, or is there another way to achieve this?
Any help on how I can parse this? Every little bit will be welcome!
Thanks!
In these cases, it's helpful to layer filters. Is the JSON blob the last bit of the event? If so, why not capture the whole thing into a single field, and then use the JSON filter to parse the result?
filter {
grok {
# ... capture the _whole_ JSON blob; store it in `[@metadata][json_payload]`
}
json {
source => "[@metadata][json_payload]"
}
}
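A more concrete version of that sketch might look like the following. The `(?m)` flag lets `.` match the newlines inside the multiline event, and capturing everything from the first `{` to the last `}` is an assumption about your log layout, so adjust the pattern to your actual messages:

```
filter {
  grok {
    # capture everything from the first "{" to the last "}" into a metadata field
    match => { "message" => "(?m)(?<[@metadata][json_payload]>\{.*\})" }
  }
  json {
    source => "[@metadata][json_payload]"
  }
}
```

Fields under `[@metadata]` are not passed to outputs, so the raw blob won't clutter the events you index.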
Thank you for the answer, I think this might be the solution to my problem.
But could I split the info from inside the JSON somehow? Also, not all the information blocks contain JSON. Can I wrap the json filter in an if condition? I tried, but I didn't get any result.
Thanks!
What specifically did you try? How did it behave differently than you expected? Including example pipeline configurations and several example input events is the best way to give us enough context to help.
But I am not getting any result besides the "data" field. What am I doing wrong?
Also, I am facing another problem: some blocks contain JSON info but some don't.
Can I specify multiple grok patterns in one filter?
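For reference, grok's `match` option accepts an array of patterns for a single field and, by default, stops at the first one that matches, and a Logstash conditional can guard the json filter so it only runs when a blob was actually captured. A sketch combining both (the patterns and field names here are illustrative, not taken from the thread):

```
filter {
  grok {
    # try the pattern with a trailing JSON blob first, then fall back to plain lines
    match => { "message" => [
      "(?m)^%{LOGLEVEL:level} - %{DATA:text} - %{GREEDYDATA:logdate}\n(?<[@metadata][json_payload]>\{.*\})",
      "^%{LOGLEVEL:level} - %{DATA:text} - %{GREEDYDATA:logdate}"
    ] }
  }
  # only parse the blob when the first pattern captured one
  if [@metadata][json_payload] {
    json {
      source => "[@metadata][json_payload]"
    }
  }
}
```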