Hi,
I'm using Filebeat and Elasticsearch 7.5.2, and I am trying to stream specific lines from a local log file to ES.
The lines I am interested in have the following format:
{"level":30,"time":1579750597224,"pid":32172,"hostname":"abc","name":"sdk","data":{"TYPE":"REQUEST","UUID":"47DE724F-4198-4563-9BB8-3EA2498B5E4F","METHOD":"POST","URL":"...},"msg":"[47DE724F-4198-4563-9BB8-3EA2498B5E4F] - Request on /api","v":1}
My Filebeat config looks like this:
- type: log
  enabled: true
  paths:
    - /path/to/log/*output.log
  processors:
    - drop_event:
        when:
          not:
            regexp:
              message: '^(.*?)(Request|Response)(.*)'
    - decode_json_fields:
        fields: ["message"]
        process_array: false
        max_depth: 1
        target: ""
        overwrite_keys: true
        add_error_key: true
I only want the lines that have the nested "data" JSON object, i.e. the ones containing the words I match with the regex in the config above.
The problem is that the config above seems to drop every log line, even the valid ones. If I omit the decode_json_fields processor, the correct logs do appear in Elasticsearch, but the message arrives as a plain string, which is useless to me.
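For reference, the regex itself does match a valid sample line when I test it outside Filebeat (quick standalone check; the log line below is abbreviated from the example above):

```python
import re

# Same pattern as in the drop_event condition
pattern = re.compile(r'^(.*?)(Request|Response)(.*)')

# Abbreviated version of a valid log line (UUID and URL shortened here)
line = ('{"level":30,"time":1579750597224,"pid":32172,"hostname":"abc",'
        '"name":"sdk","data":{"TYPE":"REQUEST","UUID":"47DE724F"},'
        '"msg":"[47DE724F] - Request on /api","v":1}')

# The pattern matches "Request" in the msg field, so the "not" condition
# in drop_event should keep this line rather than drop it
print(bool(pattern.search(line)))
```

So the drop_event condition should not be the part that discards the valid lines.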
If I omit decode_json_fields and instead add the following to the input config:
json.keys_under_root: true
json.add_error_key: true
then I get nothing in ES again.
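For completeness, that second variant looks like this in my filebeat.yml (same path and drop_event condition as above):

```yaml
- type: log
  enabled: true
  paths:
    - /path/to/log/*output.log
  json.keys_under_root: true
  json.add_error_key: true
  processors:
    - drop_event:
        when:
          not:
            regexp:
              message: '^(.*?)(Request|Response)(.*)'
```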
It seems that I am missing something related to the order in which the JSON parsing and the processors are executed.
Can someone help?
Thanks