Until now, we have had Logstash produce its log messages in plain-text format (written to /var/log/logstash/logstash-plain.log), and we had Filebeat ship those log messages to a Logstash cluster, where they were processed using various grok/match filters.
But now I want to have Logstash produce its log messages in JSON and avoid all the grok'ing and matching. Getting Logstash to produce log messages in JSON (or ndjson, newline-delimited JSON) is easy: just add --log.format json to the command that starts Logstash, as in
/usr/share/logstash/bin/logstash" --path.settings "/etc/logstash" --log.format "json"
And out come log messages like this in /var/log/logstash/logstash-json.log:
{"level":"INFO","loggerName":"logstash.javapipeline","timeMillis":1690897355410,"thread":"[main]-pipeline-manager","logEvent":{"message":"Pipeline started","pipeline.id":"main"}}
{"level":"INFO","loggerName":"logstash.agent","timeMillis":1690897355457,"thread":"Agent thread","logEvent":{"message":"Pipelines running","count":1,"running_pipelines":[{"metaClass":{"metaClass":{"metaClass":{"running_pipelines":"[:main]","non_running_pipelines":[]}}}}]}}
But what do I put in my Filebeat configuration (/etc/filebeat/conf.d/logstash.yml) to ship these messages to Logstash? Something like this, I hope:
- type: log
  paths:
    - /var/log/logstash/logstash-json.log
  encoding: plain
  document_type: json
  json:
    keys_under_root: true
    overwrite_keys: true
    add_error_key: true
    [more stuff]
  fields:
    type: logstash
    server: ********
  fields_under_root: false
Do I need anything else? I have experimented with the message_key option, both as message_key: logEvent and as message_key: logEvent.message, but without any noticeable result or difference.
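For reference, this is roughly the event I would expect Filebeat to ship, given keys_under_root: true (hand-written from the first sample log line above, not captured from a real event, and ignoring the extra metadata fields Filebeat adds on its own):

{
  "level": "INFO",
  "loggerName": "logstash.javapipeline",
  "timeMillis": 1690897355410,
  "thread": "[main]-pipeline-manager",
  "logEvent": {"message": "Pipeline started", "pipeline.id": "main"},
  "fields": {"type": "logstash", "server": "********"}
}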
Over at the Logstash side, I have been experimenting with a filter configuration like this:
filter {
  if [fields][type] == "logstash" {
    json {
      source => "message"
    }
  }
}

output {
  if [fields][type] == "logstash" {
    elasticsearch {
      [blah blah blah]
    }
  }
}
It sort of works. The log messages appear and can be viewed in Kibana, but every single message has a _grokparsefailure tag attached to it, even though I do not have a grok filter anywhere in the logstash.conf file?!?
And there is no message field, only a logEvent.message field.
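I am guessing that a mutate filter with its copy option could move the value across, something along these lines (untested, and I am not sure the field-reference syntax is right):

filter {
  if [fields][type] == "logstash" {
    mutate {
      copy => { "[logEvent][message]" => "message" }
    }
  }
}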
Does anyone have a working, minimal Filebeat configuration and a working, minimal Logstash filter that could help me out? Why do I get a _grokparsefailure when I am not grokking, and how do I copy the information in the logEvent fields into the (normal) message field? Is a mutate like the one above the right approach, or is there a cleaner way?