Kafka Input Plugin issue

Hi All,

I came across a scenario where I am producing logs from a PHP application directly into Kafka and using Logstash to index those logs into Elasticsearch. To test fault tolerance, I stopped one of the brokers, and I saw that Logstash dropped logs once the Kafka broker came back up. I ran the Kafka console consumer and Logstash side by side to compare how many logs each was processing, and found that the console consumer processed the logs correctly while Logstash dropped some of them.

My Logstash config is:

input {
  kafka {
    zk_connect => "X.X.X.X:2182,X.X.X.X:2182,X.X.X.X:2182"
    group_id => "test1"
    topic_id => "test"
    consumer_threads => 5
    consumer_restart_on_error => true
    consumer_restart_sleep_ms => 100
    queue_size => 100
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  stdout { codec => rubydebug }
}

If you restart Logstash, does it process the logs then?

Check for errors in the Logstash log. You could also try bumping rebalance_max_retries up.
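For reference, a minimal sketch of what that might look like in the input block, assuming a Zookeeper-based version of logstash-input-kafka that supports these options (the values here are illustrative, not recommendations):

input {
  kafka {
    zk_connect => "X.X.X.X:2182,X.X.X.X:2182,X.X.X.X:2182"
    group_id => "test1"
    topic_id => "test"
    consumer_threads => 5
    # Retry the consumer group rebalance more times before giving up
    rebalance_max_retries => 10
    # Wait longer between rebalance attempts (milliseconds)
    rebalance_backoff_ms => 5000
  }
}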

@Joe_Lawson I tried restarting Logstash, but still not all of the logs were processed. The only fix I found was to change the consumer group. I haven't tried rebalance_max_retries yet, but I will give it a try for sure.
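In case it helps others, the workaround looked roughly like this; the new group name is just an example, and the auto_offset_reset setting is an assumption on my part about what you may want (a new group has no committed offsets, so this controls where it starts reading):

input {
  kafka {
    zk_connect => "X.X.X.X:2182,X.X.X.X:2182,X.X.X.X:2182"
    # A fresh group has no committed offsets in Zookeeper, so it consumes from a clean state
    group_id => "test2"
    topic_id => "test"
    # "smallest" replays from the oldest retained offset, which may duplicate already-indexed logs
    auto_offset_reset => "smallest"
  }
}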

But even if that doesn't work, this is a serious bug in the Kafka input plugin, because we rely heavily on Logstash for indexing.

Can you please check your Logstash, Kafka broker and Zookeeper logs for errors?