Logstash Kafka input causes duplicates


(Guyisra) #1

I am using Logstash 5.5.1 with kafka input 5.1.8.
When consuming the topic, Logstash forwards many duplicates to the output, even though each event itself was consumed only once.

Is this a bug or a configuration issue?

This leads me to believe Logstash is not reliable as a Kafka consumer, since other libraries that consume from Kafka do not exhibit this kind of issue...


(Magnus Bäck) #2

Seeing the smallest possible configuration that reproduces the issue would probably be helpful to those who know Kafka and the kafka input plugin.


(Guyisra) #3

This is the logstash.conf. Workers and batch size are at their defaults (for 5.5.1). The topics being consumed produce hundreds of thousands of events a day.

input {
  kafka {
    bootstrap_servers => "${KAFKA_BROKER_ADDRESS}"
    topics_pattern => "${KAFKA_CONSUMED_TOPICS_PATTERN}"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  http {
    url => "${URL}"
    http_method => 'post'
    socket_timeout => 180
    pool_max_per_route => 50
    automatic_retries => 5
  }
}
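For what it's worth, a common cause of Kafka duplicates is consumer-group rebalancing: if a Logstash batch (here, slow HTTP output calls with a 180 s socket timeout) takes longer than the poll interval, the broker assumes the consumer died, reassigns its partitions, and the uncommitted offsets get re-delivered. A sketch of an input block that makes the group explicit and reduces the work per poll (the values and the `group_id` name below are illustrative, not recommendations):

```
input {
  kafka {
    bootstrap_servers => "${KAFKA_BROKER_ADDRESS}"
    topics_pattern    => "${KAFKA_CONSUMED_TOPICS_PATTERN}"
    # Pin an explicit consumer group so all Logstash instances share offsets
    group_id          => "logstash-http-forwarder"   # illustrative name
    # Fetch fewer records per poll so a batch finishes before the broker
    # considers the consumer dead and triggers a rebalance
    max_poll_records  => "100"                       # illustrative value
    # Give slow batches more headroom before a session timeout
    session_timeout_ms => "60000"                    # illustrative value
  }
}
```

If duplicates persist, comparing the consumer group's lag and rebalance activity (e.g. with `kafka-consumer-groups.sh --describe`) while Logstash is running should show whether partitions are being reassigned mid-batch.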

(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.