I used Kafka as my input plugin like this:
```
input {
  kafka {
    bootstrap_servers => "192.168.23.125:9092,192.168.23.126:9092,192.168.23.127:9092"
    topics => ["Apache_log"]
    group_id => "test-consumer-group"
    consumer_threads => 3
    auto_offset_reset => "earliest"
    max_poll_records => "10"
    session_timeout_ms => "60000"
    request_timeout_ms => "63000"
  }
}

output {
  file {
    path => "/data/logs/apache.log"
  }
}
```
All other settings are left at their defaults. Logstash started fine, but after a while it failed with an OutOfMemory error. After I increased the JVM heap size to 8G and restarted Logstash, it just ran longer before hitting the same error. Why does Logstash keep reading data from Kafka while nothing gets flushed to the file?
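For reference, this is roughly how I raised the heap (I edited Logstash's `config/jvm.options`; the exact path may differ depending on how it was installed):

```
# config/jvm.options - heap settings I changed
-Xms8g
-Xmx8g
```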