ENV:
- Ambari 2.7
- HDP 3.1
- Enabled Kerberos and AD
Logstash initially failed with the exception "no such file to load -- gssapi", so I installed the gssapi gem for Logstash following the steps in dengshaochun's post "use kerberos webhdfs start error with exception 'no such file to load -- gssapi'". When I run it again, I now get the error below. For reference, the webhdfs output is configured roughly as in the sketch that follows.
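This is only an outline of the output section that hits the failing code path (Kerberos auth against the namenode shown in the log); the HDFS path, user, and keytab location are placeholders, not exact values:

output {
  webhdfs {
    host => "m1.node.hadoop"        # namenode from the error message above
    port => 50070
    path => "/tmp/logstash/logstash-%{+YYYY-MM-dd}.log"   # placeholder HDFS path
    user => "logstash"                                      # placeholder principal
    use_kerberos_auth => true
    kerberos_keytab => "/etc/security/keytabs/logstash.keytab"  # placeholder keytab
  }
  stdout { codec => rubydebug }
}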
[2019-04-30T10:36:39,079][DEBUG][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main"}
[2019-04-30T10:36:39,658][ERROR][logstash.outputs.webhdfs ] Webhdfs check request failed. (namenode: m1.node.hadoop:50070, Exception: undefined method `read_uint32' for #<FFI::MemoryPointer address=0x7f39740901a0 size=4>)
[2019-04-30T10:36:39,665][DEBUG][logstash.outputs.stdout ] Closing {:plugin=>"LogStash::Outputs::Stdout"}
[2019-04-30T10:36:39,711][ERROR][logstash.javapipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<WebHDFS::KerberosError: undefined method `read_uint32' for #<FFI::MemoryPointer address=0x7f39740901a0 size=4>>, :backtrace=>["/usr/local/share/logstash-7.0.0/vendor/bundle/jruby/2.5.0/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:323:in `request'", "/usr/local/share/logstash-7.0.0/vendor/bundle/jruby/2.5.0/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:275:in `operate_requests'", "/usr/local/share/logstash-7.0.0/vendor/bundle/jruby/2.5.0/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:138:in `list'", "/usr/local/share/logstash-7.0.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-webhdfs-3.0.6/lib/logstash/outputs/webhdfs_helper.rb:49:in `test_client'", "/usr/local/share/logstash-7.0.0/vendor/bundle/jruby/2.5.0/gems/logstash-output-webhdfs-3.0.6/lib/logstash/outputs/webhdfs.rb:155:in `register'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:106:in `register'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:48:in `register'", "/usr/local/share/logstash-7.0.0/logstash-core/lib/logstash/java_pipeline.rb:191:in `block in register_plugins'", "org/jruby/RubyArray.java:1792:in `each'", "/usr/local/share/logstash-7.0.0/logstash-core/lib/logstash/java_pipeline.rb:190:in `register_plugins'", "/usr/local/share/logstash-7.0.0/logstash-core/lib/logstash/java_pipeline.rb:445:in `maybe_setup_out_plugins'", "/usr/local/share/logstash-7.0.0/logstash-core/lib/logstash/java_pipeline.rb:203:in `start_workers'", "/usr/local/share/logstash-7.0.0/logstash-core/lib/logstash/java_pipeline.rb:145:in `run'", "/usr/local/share/logstash-7.0.0/logstash-core/lib/logstash/java_pipeline.rb:104:in `block in start'"], :thread=>"#<Thread:0x7cac5f13 run>"}
[2019-04-30T10:36:39,738][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2019-04-30T10:36:39,794][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-04-30T10:36:39,795][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-04-30T10:36:39,834][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2019-04-30T10:36:39,877][DEBUG][logstash.agent ] Starting puma
[2019-04-30T10:36:39,883][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2019-04-30T10:36:39,888][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2019-04-30T10:36:39,894][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2019-04-30T10:36:39,893][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2019-04-30T10:36:39,940][DEBUG][logstash.api.service ] [api-service] start
[2019-04-30T10:36:40,068][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-04-30T10:36:44,971][DEBUG][logstash.agent ] Shutting down all pipelines {:pipelines_count=>0}
[2019-04-30T10:36:44,981][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>0}
[2019-04-30T10:36:44,987][INFO ][logstash.runner ] Logstash shut down.
Below is the log from running the same setup in Docker:
[2019-04-30T03:32:59,304][ERROR][logstash.outputs.webhdfs ] Webhdfs check request failed. (namenode: m1.node.hadoop:50070, Exception: undefined method `read_uint32' for #<FFI::MemoryPointer address=0x7fa824119b90 size=4>
Did you mean? read_uint
read_int
read_array_of_uint32
read_array_of_int32
read_pointer
read_ulong
read_string
read_ushort
read_array_of_uint64
read_array_of_uint16
get_uint32)
Thank you for your help.