Hi,
I'm trying to get a basic Filebeat example working, and I am unable to get the message contents decomposed as JSON. I'll try to explain.
I have a simple Node application writing to a log file. The log file output looks like this:
{"name":"foo","hostname":"local","pid":21894,"level":30,"msg":"hi","time":"2017-11-03T17:02:24.990Z","v":0}
{"name":"foo","hostname":"local","pid":21894,"level":30,"msg":"hi","time":"2017-11-03T17:02:25.994Z","v":0}
{"name":"foo","hostname":"local","pid":21894,"level":30,"msg":"hi","time":"2017-11-03T17:02:26.997Z","v":0}
{"name":"foo","hostname":"local","pid":21894,"level":30,"msg":"hi","time":"2017-11-03T17:02:28.001Z","v":0}
This is saved in a file called app.log. Messages are created every second for test purposes, and each log message is on its own line.
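For context, the logging side is nothing exotic. Here is a minimal equivalent of my test app, sketched with Bunyan since the output format above matches Bunyan's defaults (the logger name and one-second interval are just my test setup):

// app.js — writes one JSON log line per second to app.log
const bunyan = require('bunyan');
const log = bunyan.createLogger({
  name: 'foo',
  streams: [{ path: 'app.log' }]
});
setInterval(() => log.info('hi'), 1000);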
My Filebeat config is as follows:
filebeat.prospectors:
- input_type: log
  paths:
    - /path/*.log

output.elasticsearch:
  hosts: ["http://localhost:9200"]

json.keys_under_root: true
json.add_error_key: true
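From my reading of the Filebeat docs I'm not sure the json options are in the right place, so I also wondered whether they should be nested under the prospector instead, something like this (just a guess on my part, untested):

filebeat.prospectors:
- input_type: log
  paths:
    - /path/*.log
  json.keys_under_root: true
  json.add_error_key: true

Is that where they're supposed to go?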
I'm hoping that each log message will be unpacked into separate fields in Elasticsearch, but when I view the entry in Kibana, it looks like this:
{
  "_index": "filebeat-2017.11.03",
  "_type": "doc",
  "_id": "AV-C2xUTNC5xuoUVJlSh",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2017-11-03T17:08:28.445Z",
    "beat": {
      "hostname": "local",
      "name": "local",
      "version": "5.6.2"
    },
    "input_type": "log",
    "message": "{\"name\":\"foo\",\"hostname\":\"local\",\"pid\":21894,\"level\":30,\"msg\":\"hi\",\"time\":\"2017-11-03T17:08:28.048Z\",\"v\":0}",
    "offset": 46464,
    "source": "/PATH/apps.log",
    "type": "log"
  },
  "fields": {
    "@timestamp": [
      1509728908445
    ]
  },
  "sort": [
    1509728908445
  ]
}
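What I was hoping to see instead is the JSON decoded into top-level fields of _source, something like this (a hand-written illustration of the result I'm after, not actual output):

"_source": {
  "@timestamp": "2017-11-03T17:08:28.445Z",
  "beat": { ... },
  "name": "foo",
  "hostname": "local",
  "pid": 21894,
  "level": 30,
  "msg": "hi",
  "time": "2017-11-03T17:08:28.048Z",
  "v": 0,
  ...
}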
Why won't it parse the JSON in the message field? Ideally, I want all of the fields parsed out for analysis. Do I have to use Logstash for this?