I'm trying to parse JSON logs our server application is producing. It writes to 3 log files in a directory that I mount into a Docker container running Filebeat. So far so good: Filebeat is reading the log files. However, in Kibana the messages arrive, but the JSON content is only shown as a single string in a field called "message"; the data inside it is not accessible via its own fields (like "source_address", etc.).
I'm writing the logs using logrus and I want Beats to pass them straight on to Elasticsearch. I went through all kinds of documentation and found multiple ways of achieving what I want, but none of them is working.
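For context, each log line is a single JSON object, roughly like what this sketch produces (plain stdlib here rather than logrus, and field names like source_address are examples from our own schema):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// entry mirrors the shape of a logrus JSONFormatter line;
// source_address is one of our custom fields.
type entry struct {
	Time          string `json:"time"`
	Level         string `json:"level"`
	Msg           string `json:"msg"`
	SourceAddress string `json:"source_address"`
}

// formatEntry renders one log line the way it appears in the files.
func formatEntry() string {
	e := entry{
		Time:          "2017-01-01T00:00:00Z",
		Level:         "info",
		Msg:           "connection accepted",
		SourceAddress: "10.0.0.7",
	}
	b, _ := json.Marshal(e)
	return string(b)
}

func main() {
	// one JSON object per line, which is what Filebeat picks up
	fmt.Println(formatEntry())
}
```

So the raw "message" Filebeat ships is that whole JSON object as a string, and I want its keys broken out into fields.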
This is my current Filebeat configuration:
filebeat.prospectors:
- input_type: log
  paths:
    - /mnt/log/*.log
  multiline.pattern: '^{'
  multiline.negate: false
  multiline.match: after

processors:
- decode_json_fields:
    fields: ['message']
    target: json

output.elasticsearch:
  hosts: ["elasticsearch:9200"]
  template.name: filebeat
  template.path: filebeat.template.json
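One of the alternatives I came across in the Filebeat docs is the prospector-level json options instead of the decode_json_fields processor; I haven't gotten this variant to work either, so treat it as a sketch straight from the docs:

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /mnt/log/*.log
  # parse each line as JSON and lift its keys to the top level of the event
  json.keys_under_root: true
  # add an error key to the event if parsing fails
  json.add_error_key: true
```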
A little nudge in the right direction would be greatly appreciated.