Logstash OutOfMemory error

I used Kafka as my input plugin, like this:
```
input {
  kafka {
    bootstrap_servers => "192.168.23.125:9092,192.168.23.126:9092,192.168.23.127:9092"
    topics => ["Apache_log"]
    group_id => "test-consumer-group"
    consumer_threads => 3
    auto_offset_reset => "earliest"
    max_poll_records => "10"
    session_timeout_ms => "60000"
    request_timeout_ms => "63000"
  }
}

output {
  file {
    path => "/data/logs/apache.log"
  }
}
```
All other settings are left at their defaults. Logstash started fine, but after a while an OutOfMemory error occurred. After I increased the JVM heap size to 8 GB and restarted Logstash, it simply ran longer before hitting the same error. Why does Logstash keep reading data from Kafka while nothing gets flushed to the file?
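
For reference, heap use in a setup like this mostly depends on how many events are held in flight between the Kafka input and the file output. Below is a minimal sketch of `config/logstash.yml` settings that bound that number; the values are illustrative assumptions, not tuned for this cluster, and exact defaults vary by Logstash version:

```yaml
# config/logstash.yml -- limit in-flight events per pipeline (illustrative values)

pipeline.workers: 2        # fewer worker threads -> fewer concurrent batches on the heap
pipeline.batch.size: 125   # events each worker pulls per batch before filtering/output
pipeline.batch.delay: 50   # ms to wait for a full batch before flushing a partial one

# Optional: buffer events on disk instead of holding them all in memory
queue.type: persisted
queue.max_bytes: 1gb
```

With `max_poll_records` already set to 10 on the input side, bounding the pipeline batches like this is only one possible direction; it assumes the memory growth comes from events piling up in front of a slow output rather than from the Kafka consumer itself.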
