Logstash-kafka plugin


(Nitish Upadhyay) #1

Greetings Everyone,
I am a very new user of Logstash and still very much in the learning phase, but I am running into a curious problem with my logstash-kafka setup. When I try to send traffic from Logstash to Kafka and Kafka isn't available, Logstash just crashes and doesn't even try to resend the traffic. What I want is for Logstash to retry sending to Kafka a few times, and if Kafka still can't be reached, to dump the data somewhere until Kafka comes back, instead of crashing. Is it possible to achieve this with Logstash? Thank you,
hope to see your reply soon.


(Joe Lawson) #2

Could you share your config? Generally the Kafka output in Logstash should block, i.e.
stop sending messages, if it is unable to communicate with the broker. There
are a number of settings that can affect how it reacts to errors. Did you
get any errors?


(Nitish Upadhyay) #3

input {
  udp {
    port => 9995
    codec => netflow
  }
}

output {
  kafka {
    codec => json
    broker_list => "node5:9092"
    topic_id => "netflow"
  }

  stdout { codec => rubydebug }

  file {
    codec => rubydebug
    path => "/home/vagrant/netflow-%{+YYYY-MM-dd-HH}.json"
  }
}
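
For reference, here is a sketch of how the kafka output above could be made more resilient to broker outages. The option names retries, retry_backoff_ms, and bootstrap_servers come from newer versions of the logstash-output-kafka plugin; the older plugin that uses broker_list/topic_id may not support them, so check the documentation for the plugin version you actually have installed:

output {
  kafka {
    codec => json
    bootstrap_servers => "node5:9092"  # newer plugins use this in place of broker_list
    topic_id => "netflow"
    retries => 5                       # attempt to resend a few times before giving up
    retry_backoff_ms => 1000           # wait between retry attempts
  }
}

Keeping the separate file output in parallel, as in the config above, also gives you a local on-disk copy of the events that survives a Kafka outage.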


(system) #4