I am using the Kafka input plugin in Logstash to read messages and push them to Elasticsearch. After processing 3M messages, the Logstash process stays pinned at 100% CPU continuously, even though it is no longer doing any work.
Here's my config:
input {
  beats {
    port => 5048
    type => 'hit'
  }
  kafka {
    zk_connect => "10.35.132.117:2181,10.187.147.57:2181,10.187.147.58:2181"
    topic_id => "rsyslog_logstash"
    reset_beginning => false
    type => "kafka"
  }
}
filter {
  drop {}
  if [type] == "kafka" {
    date {
      match => [ "timestamp", "ISO8601" ]
      locale => "en"
    }
  }
  metrics {
    meter => "events"
    add_tag => "metric"
  }
}
output {
  if "metric" in [tags] {
    stdout {
      codec => line {
        format => "rate: %{[events][rate_1m]}"
      }
    }
  }
  if [fingerprint] {
    elasticsearch {
      hosts => ["10.35.76.37","10.35.132.143","10.35.132.142"]
      index => "hits-%{+YYYY.MM.dd}"
      template => "./conf/hits.json"
      template_name => "hits"
      document_type => "hit"
      template_overwrite => true
      manage_template => true
      flush_size => "1000"
    }
  } else {
  }
}
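To narrow this down, I think a stripped-down pipeline with only the kafka input and a plain stdout output (same zk_connect and topic_id as above, no filters) should show whether the input alone pegs the CPU. This is just a minimal sketch for testing, not my real config:

input {
  kafka {
    zk_connect => "10.35.132.117:2181,10.187.147.57:2181,10.187.147.58:2181"
    topic_id => "rsyslog_logstash"
    reset_beginning => false
    type => "kafka"
  }
}
output {
  # print every event so it is obvious whether the consumer is still receiving anything
  stdout { codec => rubydebug }
}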
A plain kill did not terminate the process; I had to use kill -3.