I have connected Filebeat to Logstash to process logs from my Docker app. Filebeat, Logstash, Elasticsearch and Kibana are all dockerized. I want to harvest the Docker logs, so I implemented a Python app that generates logs like these:
This is just message at: 2020-06-01 14:49:41.287697
{"time": "2020-06-01 14:49:42.288979", "text": "Random text"}
This is just message at: 2020-06-01 14:49:45.291579
{"time": "2020-06-01 14:49:46.292781", "text": "Random text"}
This is just message at: 2020-06-01 14:49:49.295620
{"time": "2020-06-01 14:49:50.296886", "text": "Random text"}
This is just message at: 2020-06-01 14:49:53.299630
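For reference, a minimal sketch of the generator that produces the output above (the exact intervals and function name are assumptions, not the original code):

```python
import json
import time
from datetime import datetime

def emit_logs(iterations=2, delay=1):
    """Alternate a plain-text line and a JSON line, mimicking the sample above."""
    for _ in range(iterations):
        # Plain-text line, which decode_json_fields cannot parse
        print(f"This is just message at: {datetime.now()}")
        time.sleep(delay)
        # Valid JSON line, which decode_json_fields parses fine
        print(json.dumps({"time": str(datetime.now()), "text": "Random text"}))
        time.sleep(delay)

if __name__ == "__main__":
    emit_logs()
```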
Filebeat processes these logs, and that part works fine. I am trying to parse the JSON messages with the decode_json_fields processor and have set add_error_key: true. The JSON messages are parsed properly, but when a line is plain text the message is not parsed, which is expected; however, no error field is added either. From the Filebeat 7.7 documentation:

If it set to true, in case of error while decoding json keys error field is going to be part of event with error message.

This is my Filebeat configuration:
filebeat.inputs:
- type: container
  paths:
    - '/var/lib/docker/containers/*/*.log'
  processors:
    - add_docker_metadata:
        host: "unix:///var/run/docker.sock"
    - decode_json_fields:
        fields: ["message"]
        target: ""
        add_error_key: true
        overwrite_keys: true

output.logstash:
  hosts: ["logstash:5000"]

logging.level: info
logging.json: true
logging.metrics.enabled: false
Expectation:
If a message can be parsed, it is parsed and its fields are moved to the event root. When the line is just a string, I would expect some error indicator, similar to how Logstash's json filter appends the tag _jsonparsefailure when it cannot parse a message.
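To make the expectation concrete, here is a sketch in Python of the semantics I would expect from decode_json_fields with add_error_key: true (the error field name and message are assumptions; this is an illustration, not Filebeat's actual implementation):

```python
import json

def decode_message(event):
    """Decode event["message"] as JSON into the event root,
    attaching an error field on failure (expected behavior)."""
    try:
        decoded = json.loads(event["message"])
        event.update(decoded)  # move parsed keys to the event root
    except (json.JSONDecodeError, TypeError):
        # Hypothetical error field, analogous to Logstash's _jsonparsefailure tag
        event["error"] = {"message": "Error decoding JSON keys"}
    return event

ok = decode_message({"message": '{"time": "2020-06-01", "text": "Random text"}'})
bad = decode_message({"message": "This is just message at: 2020-06-01 14:49:41"})
```

With this behavior, ok would carry the decoded time and text fields, while bad would carry the error field; in practice Filebeat adds no error field at all for the plain-text lines.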
Why is no error message included when parsing lines like this: This is just message at: 2020-06-01 14:49:41.287697?