Prevent logstash from adding fields to json input

I've got a simple setup with a tcp input and kafka output.

input {
  tcp {
    port => "${TCP_PORT:1514}"
    codec => json
  }
}
output {
  stdout { codec => rubydebug }
  kafka {
    bootstrap_servers => "${KAFKA_BROKERS}"
    topic_id => "${KAFKATOPIC}"
    codec => json
  }
}
I'm sending some simple json:
"name": "test_message_1",
"identifier": "randomID"

All I want to see in Kafka is the same message I'm putting into Logstash: no @version, no @timestamp.

What I'm getting out, however, is:

I thought I had this working for a while by specifying the codec as json (I've tried json_lines too), but either something has changed recently or I've missed something previously.

I've recently been using logstash 6.5.3 but have also tried 6.4.3 (that was the version I was using when I first thought I had this working).

My only other config is:
config.reload.automatic: "true"
path.config: /usr/share/logstash/pipeline /usr/share/logstash/data
queue.checkpoint.writes: 1
queue.drain: "true"
queue.max_bytes: 1gb
queue.type: persisted

Does anybody have any idea what's going on?
This is doing my head in :frowning:


If you have no filters then do not bother parsing the JSON, and use a plain codec. For example,

output { stdout { codec => plain { format => "%{message}
" } } }

That doesn't work, unfortunately :frowning: I've tried codec => line { format => "%{message}" } in the past, with the same result.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.