The processor configuration for one of my Filebeat instances looks like this:
- type: log
  processors:
    - dissect:
        # Sample log line: 2021-12-08T08:34:04.370+0100 INFO [monitoring] log/log.go:144 Non-zero metrics in the last 30s {"monitoring": {"metrics
        # data types: string to integer, long, float, double, boolean or ip
        tokenizer: "%{date}\t%{event.type}\t%{class}\t%{script}\t%{messageCut}"
        field: "message"
        target_prefix: ""
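        # An empty target_prefix writes the dissected keys at the root of the
        # event instead of under the default "dissect" prefix.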
    - timestamp:
        field: date
        layouts:
          - '2006-01-02T15:04:05.999Z07:00'
          - '2006-01-02T15:04:05.999Z0700'
          - '2006-01-02T15:04:05.999999999Z07:00'
          #- '2006-01-02T15:04:05.999-07:00'
        test:
          - '2021-12-08T08:34:04.370+0100'
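        # Layouts use Go's reference time (2006-01-02T15:04:05); each value
        # under "test" must parse against the layouts when the config is loaded.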
    - drop_fields:
        fields: ["date", "class", "script", "message"]
    - rename:
        fields:
          - from: "messageCut"
            to: "message"
  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/data/log/heartbeat/heartbeat.log

  fields:
    service.type: heartbeat
    event.module: heartbeat
    event.dataset: heartbeat.beat
  fields_under_root: true
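For the sample line in the comment above (assuming the fields really are tab-separated, as the \t tokens in the tokenizer require), this chain would leave an event roughly like the sketch below: drop_fields removes date, class, script, and the original message, then rename turns messageCut into message. This is a sketch of the resulting fields, not actual Filebeat output:

  "@timestamp": "2021-12-08T07:34:04.370Z"   # "date" parsed by the timestamp processor, normalized to UTC
  event.type: "INFO"
  message: 'Non-zero metrics in the last 30s {"monitoring": {"metrics'
  service.type: "heartbeat"
  event.module: "heartbeat"
  event.dataset: "heartbeat.beat"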
That said, we do most of our processing in Logstash, which was built for this purpose.
You could also easily add another processor in Filebeat to add fields, as in the sketch below.
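A minimal sketch using Filebeat's add_fields processor; the environment field here is a made-up placeholder:

  processors:
    - add_fields:
        target: ""                  # write at the event root instead of under "fields."
        fields:
          environment: production   # hypothetical example field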