Hi,
I have Logstash 2.3 set up with Elasticsearch 2.3. I am using the elasticsearch output and the kafka input. My input consumes from a single Kafka topic, and the input section looks like this:
input {
  kafka {
    white_list => "topic"
    consumer_threads => 1
    queue_size => 50
    codec => plain
    zk_connect => "zk_path"
    group_id => "logs_consumers"
    type => "log"
  }
}
The output section looks like this:
output {
  if [type] == "log" {
    elasticsearch {
      index => "index"
      flush_size => 5000
      idle_flush_time => 30
      document_type => "logl"
      hosts => [hosts]
    }
  }
}
I first ran it with a 4g heap and the process died with an OutOfMemoryError. I tried heap sizes all the way up to 7g for the JVM and it still runs out of memory. It seems like this should not be happening.
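To be concrete, here is roughly how I am sizing the JVM (a sketch; I am using the standard LS_HEAP_SIZE environment variable that the Logstash 2.x startup scripts read, and the value shown is just the largest one I tried):

# LS_HEAP_SIZE sets the heap for the Logstash JVM (tried values from 4g up to 7g)
export LS_HEAP_SIZE="7g"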
I am running with -w 2 and it's a 2-core machine. I had a very similar configuration on Logstash 1.5 and never ran into this issue.
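For completeness, the launch command looks roughly like this (the config path below is just a placeholder, not my actual path):

# start Logstash with 2 pipeline workers to match the 2-core machine
bin/logstash -f /etc/logstash/pipeline.conf -w 2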
Any help figuring out what is causing the massive memory usage here would be much appreciated. Thanks!