Logstash Kafka input causes duplicates

I am using Logstash 5.5.1 with the Kafka input plugin 5.1.8.
When consuming a topic, Logstash forwards many duplicates of an event to the output, even though the event itself appears on the topic only once.

Is this a bug or a configuration issue?

This is leading me to believe that Logstash is not reliable as a Kafka consumer, since other libraries consuming from Kafka do not exhibit this kind of issue...
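
Two things may be worth ruling out before blaming the plugin. First, Logstash 5.x concatenates every file in its config directory into a single pipeline, so if any other file defines an output, each consumed event is sent to all outputs, which looks exactly like duplication. A quick check is to start Logstash with only this one file (-f logstash.conf). Tagging events at the input and routing them conditionally also makes cross-file leakage visible; a minimal sketch, where the tag name is arbitrary:

input {
  kafka {
    bootstrap_servers => "${KAFKA_BROKER_ADDRESS}"
    topics_pattern    => "${KAFKA_CONSUMED_TOPICS_PATTERN}"
    tags              => ["from_kafka"]   # arbitrary marker for conditional routing
  }
}

output {
  if "from_kafka" in [tags] {
    http {
      url => "${URL}"
      http_method => "post"
    }
  }
}

The second suspect, at-least-once redelivery after a consumer-group rebalance, is covered after the config below.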

Seeing the smallest possible configuration that reproduces the issue would probably be helpful to those who know Kafka and the kafka input plugin.

This is the logstash.conf. Workers and batch size are the defaults for 5.5.1. The topics being consumed receive hundreds of thousands of events a day:

input {
  kafka {
    # Broker list and topic regex are taken from environment variables
    bootstrap_servers => "${KAFKA_BROKER_ADDRESS}"
    topics_pattern => "${KAFKA_CONSUMED_TOPICS_PATTERN}"
  }
}

filter {
  # Each Kafka record arrives as a JSON string; parse it into event fields
  json {
    source => "message"
  }
}

output {
  http {
    url => "${URL}"
    http_method => "post"
    socket_timeout => 180
    pool_max_per_route => 50
    automatic_retries => 5
  }
}
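
If the duplicates survive a single-file run, the other usual source is redelivery after a consumer-group rebalance. Kafka consumers are at-least-once: the kafka input commits offsets periodically, and if the consumer is evicted from the group before a commit (for example because backpressure from the http output keeps the client from polling within the session timeout), the uncommitted records are fetched again after the rebalance. Note too that automatic_retries => 5 on the http output can duplicate deliveries at the destination on its own, if a request succeeds but its response is lost and the POST is retried. A sketch of settings that could be tuned, plus a fingerprint so the receiving end can de-duplicate; the group name, key, and numbers here are illustrative guesses, not recommendations:

input {
  kafka {
    bootstrap_servers  => "${KAFKA_BROKER_ADDRESS}"
    topics_pattern     => "${KAFKA_CONSUMED_TOPICS_PATTERN}"
    group_id           => "logstash-http"   # explicit group instead of the default "logstash"
    consumer_threads   => 1                 # no more threads than partitions
    session_timeout_ms => "30000"           # must comfortably exceed worst-case batch time
    max_poll_records   => "250"             # smaller polls finish faster, fewer evictions
  }
}

filter {
  json {
    source => "message"
  }
  # Hash the raw payload so the receiver can drop redelivered events
  fingerprint {
    source => "message"
    method => "SHA256"
    key    => "dedup"        # with a key set, the plugin computes an HMAC
    target => "fingerprint"
  }
}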
