Hi,
I am trying to create a pipeline with Kafka 0.9 as input. As far as I understand, I have to use the 4.x.x version of the kafka input plugin. I removed the old version, installed the proper one, and tried running the pipeline, but ran into some issues. My conf:
input {
  kafka {
    bootstrap_servers => "i-xetl1:9092"
    group_id => "elk"
    topics => ["devOpsPipeLine"]
    auto_offset_reset => "earliest"
  }
}

filter {
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "pipeline"
  }
  stdout {}
}
With this conf, I get the following errors (the last one appears after I hit ^C):
[2017-04-26T13:27:14,322][ERROR][logstash.inputs.kafka ] Unable to create Kafka consumer from given configuration {:kafka_error_message=>org.apache.kafka.common.KafkaException: Failed to construct kafka consumer}
[2017-04-26T13:27:14,324][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::Kafka bootstrap_servers=>"i-xetl:9092", topics=>["devOpsPipeLine"], id=>"b8994149d0c345676aeb3bd773110ebf49d53340-1", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_bdcecd42-b94b-437d-bb81-2755f76c15a4", enable_metric=>true, charset=>"UTF-8">, auto_commit_interval_ms=>"10", client_id=>"logstash", consumer_threads=>1, enable_auto_commit=>"true", group_id=>"logstash", key_deserializer_class=>"org.apache.kafka.common.serialization.StringDeserializer", session_timeout_ms=>"30000", value_deserializer_class=>"org.apache.kafka.common.serialization.StringDeserializer", poll_timeout_ms=>100, ssl=>false, decorate_events=>false>
Error: uncaught throw Failed to construct kafka consumer in thread 0x318d2
[2017-04-26T13:27:14,422][WARN ][logstash.runner ] SIGINT received. Shutting down the agent.
[2017-04-26T13:27:14,431][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
[2017-04-26T13:27:14,433][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<NoMethodError: undefined method `each' for nil:NilClass>, :backtrace=>[
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-4.2.0/lib/logstash/inputs/kafka.rb:163:in `stop'",
"/opt/logstash/logstash-core/lib/logstash/inputs/base.rb:89:in `do_stop'",
"org/jruby/RubyArray.java:1613:in `each'",
"/opt/logstash/logstash-core/lib/logstash/pipeline.rb:468:in `shutdown'",
"/opt/logstash/logstash-core/lib/logstash/agent.rb:417:in `stop_pipeline'",
"/opt/logstash/logstash-core/lib/logstash/agent.rb:433:in `shutdown_pipelines'",
"org/jruby/RubyHash.java:1342:in `each'",
"/opt/logstash/logstash-core/lib/logstash/agent.rb:433:in `shutdown_pipelines'",
"/opt/logstash/logstash-core/lib/logstash/agent.rb:139:in `shutdown'",
"/opt/logstash/logstash-core/lib/logstash/runner.rb:279:in `execute'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'",
"/opt/logstash/logstash-core/lib/logstash/runner.rb:183:in `run'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'",
"/opt/logstash/lib/bootstrap/environment.rb:71:in `(root)'"]}
After playing around, it seemed bootstrap_servers was the problematic setting (if I removed it, the pipeline started successfully). I tried the same conf with more servers, and with the servers given as an array, but none of that worked.
Replacing the hostname with an IP let Logstash start, but it never actually connected (putting in a fake IP "worked" in the same way, so I don't think it was really reaching the broker).
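To rule out basic networking, this is roughly the sanity check I have in mind for whether the broker host resolves and the port accepts TCP connections (just a sketch, assuming Python 3 is on the box; the host/port are my values, and a TCP connect of course says nothing about the Kafka protocol itself):

```python
import socket

def broker_reachable(host, port, timeout=3.0):
    """Return True if `host` resolves via DNS and `host:port` accepts a TCP connection."""
    try:
        # Step 1: DNS resolution only.
        socket.getaddrinfo(host, port, type=socket.SOCK_STREAM)
    except socket.gaierror:
        return False
    try:
        # Step 2: plain TCP connect; only proves something is listening on the port.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(broker_reachable("i-xetl1", 9092))
```

If this prints False for the hostname but True for the IP, it would point at DNS rather than the plugin.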
Would appreciate any help.