Kafka consumer issue

We are using the Logstash Kafka input plugin to consume data from Kafka.
We are running 3 Logstash instances with the following config:

input {
  heartbeat {
    interval => 60
    type => "heartbeat"
    add_field => { "app.name" => "LSKTOES" }
  }
  kafka {
    zk_connect => "zk-host:2181"
    topic_id => "central-logging"
    consumer_threads => 4
  }
}
output {
  elasticsearch {
    host => ["ls-host-1","ls-host-2","ls-host-3"]
    port => "9200"
    protocol => "http"
    codec => json
    workers => 4
  }
}

We are running Kafka on one node [a single broker process] with 12 partitions.
So 3 Logstash agents with 4 consumer threads each should fetch data from all 12 partitions.
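As a sanity check on that arithmetic, here is a simplified model (not Kafka's actual code) of how the old high-level consumer's range assignor splits partitions over sorted consumer thread ids. With 12 threads and 12 partitions every thread should own exactly one partition; the empty `ArrayBuffer()` in the logs below looks like a thread that was assigned nothing, which normally only happens when there are more threads than partitions or a rebalance went wrong:

```python
# Simplified sketch of Kafka 0.8's high-level consumer "range" partition
# assignment: sort partitions and thread ids, then give each thread an
# equal slice, with earlier threads absorbing any remainder.
def range_assign(partitions, thread_ids):
    partitions = sorted(partitions)
    thread_ids = sorted(thread_ids)
    per, extra = divmod(len(partitions), len(thread_ids))
    assignment, start = {}, 0
    for i, tid in enumerate(thread_ids):
        count = per + (1 if i < extra else 0)
        assignment[tid] = partitions[start:start + count]
        start += count
    return assignment

# 3 Logstash agents x 4 consumer threads = 12 threads for 12 partitions:
threads = [f"ls{h}-{t}" for h in range(3) for t in range(4)]
assignment = range_assign(range(12), threads)
assert all(len(p) == 1 for p in assignment.values())  # one partition each

# With more threads than partitions, the surplus threads own nothing:
assignment16 = range_assign(range(12), [f"t{i:02d}" for i in range(16)])
assert sum(1 for p in assignment16.values() if not p) == 4
```

So the 3x4 layout should be balanced on paper; the question is why one consumer ends up with no partitions in practice.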

But this is not working.
One of the Logstash consumers is not consuming any messages, and in its logs I am getting these statements continuously:
log4j, [2016-03-18T09:32:22.904] INFO: kafka.consumer.ConsumerFetcherManager: [ConsumerFetcherManager-1458306025713] Added fetcher for partitions ArrayBuffer()
log4j, [2016-03-18T09:32:23.115] INFO: kafka.utils.VerifiableProperties: Verifying properties
log4j, [2016-03-18T09:32:23.115] INFO: kafka.utils.VerifiableProperties: Property client.id is overridden to logstash
log4j, [2016-03-18T09:32:23.115] INFO: kafka.utils.VerifiableProperties: Property metadata.broker.list is overridden to 10.87.164.36:9092
log4j, [2016-03-18T09:32:23.115] INFO: kafka.utils.VerifiableProperties: Property request.timeout.ms is overridden to 30000
log4j, [2016-03-18T09:32:23.115] INFO: kafka.client.ClientUtils$: Fetching metadata from broker id:1,host:10.87.164.36,port:9092 with correlation id 8652 for 1 topic(s) Set(central-logging)
log4j, [2016-03-18T09:32:23.116] INFO: kafka.producer.SyncProducer: Connected to 10.87.164.36:9092 for producing
log4j, [2016-03-18T09:32:23.117] INFO: kafka.producer.SyncProducer: Disconnecting from 10.87.164.36:9092
log4j, [2016-03-18T09:32:23.117] INFO: kafka.consumer.ConsumerFetcherManager: [ConsumerFetcherManager-1458306025713] Added fetcher for partitions ArrayBuffer()
log4j, [2016-03-18T09:32:23.336] INFO: kafka.utils.VerifiableProperties: Verifying properties
log4j, [2016-03-18T09:32:23.337] INFO: kafka.utils.VerifiableProperties: Property client.id is overridden to logstash
log4j, [2016-03-18T09:32:23.337] INFO: kafka.utils.VerifiableProperties: Property metadata.broker.list is overridden to 10.87.164.36:9092
log4j, [2016-03-18T09:32:23.337] INFO: kafka.utils.VerifiableProperties: Property request.timeout.ms is overridden to 30000
log4j, [2016-03-18T09:32:23.337] INFO: kafka.client.ClientUtils$: Fetching metadata from broker id:1,host:10.87.164.36,port:9092 with correlation id 8653 for 1 topic(s) Set(central-logging)
log4j, [2016-03-18T09:32:23.337] INFO: kafka.producer.SyncProducer: Connected to 10.87.164.36:9092 for producing
log4j, [2016-03-18T09:32:23.338] INFO: kafka.producer.SyncProducer: Disconnecting from 10.87.164.36:9092
log4j, [2016-03-18T09:32:23.338] INFO: kafka.consumer.ConsumerFetcherManager: [ConsumerFetcherManager-1458306025713] Added fetcher for partitions ArrayBuffer()

When I look at the Kafka server log, I see the entries below, which indicate the server is continuously closing the sockets to the consumers. I am not sure why the connections keep getting closed.
[2016-03-18 09:37:56,147] INFO Closing socket connection to /10.100.124.47. (kafka.network.Processor)
[2016-03-18 09:37:56,234] INFO Closing socket connection to /10.100.124.49. (kafka.network.Processor)
[2016-03-18 09:37:56,338] INFO Closing socket connection to /10.100.124.48. (kafka.network.Processor)

The IPs in the logs above are the three Logstash consumer IPs.
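The old high-level consumer records its group membership and partition ownership in ZooKeeper, so it can be checked directly which thread (if any) owns each partition. A sketch, assuming the stock Kafka 0.8.x tools are available and the group id is the plugin default `logstash` (adjust zk-host, group, and topic for your cluster):

```shell
# List the consumer ids registered in the group, and the owner node
# for each partition of the topic:
bin/zookeeper-shell.sh zk-host:2181 ls /consumers/logstash/ids
bin/zookeeper-shell.sh zk-host:2181 ls /consumers/logstash/owners/central-logging

# Per-partition owner and lag for the group (Kafka 0.8.x tool):
bin/kafka-run-class.sh kafka.tools.ConsumerOffsetChecker \
  --zookeeper zk-host:2181 --group logstash
```

If the owners list is missing entries, or the ids list keeps changing, that would point at the consumers repeatedly rebalancing rather than a broker-side problem.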
Any idea what might be wrong?

I'm guessing you are getting an error in the Kafka pipeline and Logstash is restarting the consumer over and over. Try running with --debug, and set consumer_restart_on_error => false on the kafka input (https://www.elastic.co/guide/en/logstash/current/plugins-inputs-kafka.html#plugins-inputs-kafka-consumer_restart_on_error).
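For reference, a minimal version of that change on the kafka input, keeping the settings from the original config (assuming the same plugin version that supports zk_connect):

```
kafka {
  zk_connect => "zk-host:2181"
  topic_id => "central-logging"
  consumer_threads => 4
  consumer_restart_on_error => false
}
```

With restarts disabled, the underlying error should surface once in the --debug output instead of being hidden by the restart loop.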

I tried the option you mentioned, but I am still seeing the same issue.
Any further ideas, please?