Hi,
I was using logstash-input-jms to read messages, transform them (XML parsing), and put them into Elasticsearch.
It worked quite well, although under marginally higher load (42k messages per minute) it would slow down a bit; I would just increase the number of Logstash instances and it would be fine.
I thought moving to the Kafka input would be better. The "filter" and "output" sections are exactly the same as the ones used with logstash-input-jms.
Kafka version -- kafka_2.11-0.10.2.0 (and associated zookeeper)
1 partition, I haven't changed anything at all, just the vanilla version.
Logstash -- 2.4.0
Elasticsearch -- 2.4.0
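For reference, I confirmed the topic has a single partition with the Kafka CLI (ZooKeeper address taken from my zk_connect setting below):

```shell
# Describe the topic to show its partition count and replica assignment
bin/kafka-topics.sh --describe --zookeeper zoo003:9620 --topic Push.ElasticSearch.1
```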
Here's the input,
input {
  kafka {
    zk_connect => "zoo003:9620"
    topic_id => "Push.ElasticSearch.1"
    consumer_id => "elastic"
    group_id => "elastic"
    auto_offset_reset => "largest"
  }
}
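For completeness, the plugin also accepts a consumer_threads setting, which I have left at its default of 1; a variant with it set explicitly would look like this (the value 4 is just an illustration, and as I understand it threads beyond the topic's partition count sit idle):

```
input {
  kafka {
    zk_connect => "zoo003:9620"
    topic_id => "Push.ElasticSearch.1"
    consumer_id => "elastic"
    group_id => "elastic"
    auto_offset_reset => "largest"
    consumer_threads => 4   # illustrative value, not what I run
  }
}
```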
Adding more instances of Logstash doesn't help; in fact, when I run "top", the 1st instance's CPU is about 275% (on 32 cores) and the 2nd instance's is 0-1%, so I know it's doing nothing.
It's just not able to keep up with the messages coming in at a rate of around 40K per minute (roughly 700 per second), which I don't think is a lot.
Am I missing something? Can you please help?
Thanks,
Arun