Prevent Logstash from adding fields to JSON input

Hi
I've got a simple setup with a tcp input and kafka output.

input {
  tcp {
    port => "${TCP_PORT:1514}"
    codec => json
  }
}

output {
  stdout { codec => rubydebug }
  kafka {
    bootstrap_servers => "${KAFKA_BROKERS}"
    topic_id => "${KAFKATOPIC}"
    codec => json
  }
}

I'm sending some simple json:
{
  "name": "test_message_1",
  "identifier": "randomID"
}

All I want to see in Kafka is the same message I'm putting into Logstash: no @version, no @timestamp.

What I am getting out however is:
{"name":"test_message_1","identifier":"008b8469-287c-4d52-adfc-6821f06af411","host":"XXX.XXX.XXX.XXX","port":54122,"@timestamp":"2019-02-04T17:26:24.837Z","@version":"1"}

I thought I had this working for a while by specifying the codec as json (I've tried json_lines too), but either something has changed recently or I've missed something previously.

I've recently been using Logstash 6.5.3 but have also tried 6.4.3 (the version I was using when I first thought I had this working).

My only other config is:
config.reload.automatic: "true"
path.config: /usr/share/logstash/pipeline
path.data: /usr/share/logstash/data
queue.checkpoint.writes: 1
queue.drain: "true"
queue.max_bytes: 1gb
queue.type: persisted

Does anybody have any idea what's going on?
This is doing my head in :frowning:

Thanks!

If you have no filters then do not bother parsing the JSON; use a plain codec instead. For example,

output { stdout { codec => plain { format => "%{message}
" } } }

Doesn't work, unfortunately :frowning: I've tried codec => line { format => "%{message}" } in the past, with the same result.
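One possible explanation (an assumption, since the thread doesn't confirm it): with codec => json on the tcp input, the payload is parsed into top-level fields and no message field is kept on the event, so %{message} has nothing to interpolate. A sketch that keeps the raw line instead, assuming the sender writes one JSON object per line:

input {
  tcp {
    port => "${TCP_PORT:1514}"
    # line codec keeps the raw payload in the message field
    codec => line
  }
}

output {
  kafka {
    bootstrap_servers => "${KAFKA_BROKERS}"
    topic_id => "${KAFKATOPIC}"
    # forward the original line untouched
    codec => plain { format => "%{message}" }
  }
}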

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.