Log JSON decoding errors for the TCP input

I'm setting up a Filebeat TCP input that parses each line as JSON and sends the resulting fields to Elasticsearch.

filebeat.inputs:
- type: tcp
  host: "0.0.0.0:5000"

processors:
# Decode the JSON in the "message" field into the root of the event.
- decode_json_fields:
    fields: ["message"]
    target: ""
    overwrite_keys: true
# Drop the raw message field (this runs even if decoding failed).
- drop_fields:
    fields: ["message"]

output.elasticsearch:
  hosts: ['${ES_HOST}']
  username: '${ES_USERNAME}'
  password: '${ES_PASSWORD}'

When I send invalid JSON, an empty log entry is sent to Elasticsearch.

echo '{"test' | nc 127.0.0.1 5000

Is there a way to add the JSON decoding error to the log message? I can see the log input has an add_error_key option, but I can't find anything similar for the tcp input.
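
For comparison, this is roughly what I mean from the log input (the path below is only a placeholder, not my real setup):

filebeat.inputs:
- type: log
  paths:
    - /var/log/example/*.json      # example path only
  json.keys_under_root: true
  json.add_error_key: true         # adds error.message and error.type: json on decode failure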

Thanks
Reto

With your configuration the parsing is not done by the input but by the decode_json_fields processor. Checking the docs and the code, I didn't see any support for setting an error field when decoding fails. Please open an enhancement request.
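
As a possible workaround (an untested sketch, and it assumes your decoded JSON always contains some known field, here called app_field), you could keep the raw message when decoding fails by only dropping it once a decoded field is present:

processors:
- decode_json_fields:
    fields: ["message"]
    target: ""
    overwrite_keys: true
# Only drop the raw message if decoding produced the expected field;
# otherwise the original (invalid) payload stays in "message".
- drop_fields:
    when:
      has_fields: ["app_field"]    # hypothetical field name from your JSON
    fields: ["message"]

That way the event sent to Elasticsearch is not empty when the payload is not valid JSON, even though it still won't carry an explicit decoding error.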

