Hello everybody,
I hope somebody can answer my question.
I have an app that writes logs in JSON format, already structured as messages for Elasticsearch.
One example of what I have in the logs:
{"@timestamp":"2023-09-29T07:02:48.361Z","log.level":"info","log.label":null,"log.namespace":"SudreyestrQueue","message":113753266,"client":{"ip":null},"labels":["cid_a457abfc","job_edrsr_sync_docs","entityName__documents.csv"],"meta":{"doc_id":113753266,"cause_num":"554/3112/22","date_publ":"2023-09-29","content_length":198178,"visible_status":0,"stage":"proc_index","status":"ok","description":null},"state":{"entityName":"documents.csv","cid":"a457abfc"}}
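For reference, a quick check I did (my own test, not anything Filebeat reports): the line itself parses cleanly as JSON, but note that `message` is a number here, not a string, which might matter for the Elasticsearch mapping:

```python
import json

# The sample log line from above, verbatim.
line = '{"@timestamp":"2023-09-29T07:02:48.361Z","log.level":"info","log.label":null,"log.namespace":"SudreyestrQueue","message":113753266,"client":{"ip":null},"labels":["cid_a457abfc","job_edrsr_sync_docs","entityName__documents.csv"],"meta":{"doc_id":113753266,"cause_num":"554/3112/22","date_publ":"2023-09-29","content_length":198178,"visible_status":0,"stage":"proc_index","status":"ok","description":null},"state":{"entityName":"documents.csv","cid":"a457abfc"}}'

doc = json.loads(line)       # parses without error, so the line is valid JSON
print(type(doc["message"]))  # <class 'int'> -- a number, not a string
```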
What I see is that Filebeat does not correctly pass such log entries to Elasticsearch.
Is this doable at all, or should I have my logging rewritten? This is my Filebeat configuration:
- type: filestream
  # Unique ID among all inputs; an ID is required.
  id: dhimp-imports
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/ldb/log/app.filebeat.dev.log
  json.keys_under_root: true
  json.overwrite_keys: true
  json.add_error_key: true
  json.expand_keys: true
  parsers:
    - ndjson:
        target: ""
        overwrite_keys: true
        add_error_key: true
        expand_keys: true

setup.template.overwrite: true

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
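To reason about what I expect the config to do, here is a rough sketch (my own approximation, not Filebeat's actual code) of what the ndjson parser with `target: ""` and `expand_keys: true` should produce for a line: the JSON keys land at the event root, and dotted keys like `log.level` are expanded into nested objects:

```python
import json

def expand_keys(flat):
    """Expand dotted keys ("log.level") into nested dicts, roughly
    mimicking what I understand the ndjson expand_keys option to do."""
    out = {}
    for key, value in flat.items():
        parts = key.split(".")
        node = out
        for part in parts[:-1]:
            # descend, creating intermediate objects as needed
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return out

line = '{"log.level":"info","log.namespace":"SudreyestrQueue","message":113753266}'
event = expand_keys(json.loads(line))
print(event["log"]["level"])  # info
```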