Logstash webhdfs output plugin problem

Good morning:

I am using ELK 6.2.2 to parse many kinds of files. All the other plugins work just fine, but I am getting two errors when attempting to use the webhdfs output plugin: the plugin fails to register, and the pipeline then fails to start. The messages are not very specific (at least to me), so I am struggling to debug. The error messages and configuration file are listed below. Has anyone seen this before?

ERROR MESSAGES:
[ERROR] 2018-09-10 16:40:26.109 [[main]-pipeline-manager] pipeline - Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::OutputDelegator:0x65c653d3 @namespaced_metric=#<LogStash::Instrument::NamespacedMetric:0x5a159150 @metric=#<LogStash::Instrument::Metric:0xd8679e4 @collector=#<LogStash::Instrument::Collector:0x157227bb @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x427db60d @store=#<Concurrent::map:0x00000000000fac entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x34143003>, @fast_lookup=#<Concurrent::map:0x00000000000fb0 entries=59 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :cvs_webhdfs_test]>, @metric=#<LogStash::Instrument::NamespacedMetric:0xea89734 @metric=#<LogStash::Instrument::Metric:0xd8679e4 @collector=#<LogStash::Instrument::Collector:0x157227bb @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x427db60d @store=#<Concurrent::map:0x00000000000fac entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x34143003>, @fast_lookup=#<Concurrent::map:0x00000000000fb0 entries=59 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs]>, @out_counter=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @strategy=#<LogStash::OutputDelegatorStrategies::Legacy:0x344b2d20 @worker_count=1, @workers=[<LogStash::Outputs::WebHdfs
[ERROR] 2018-09-10 16:40:26.138 [Ruby-0-Thread-1: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:22] agent - Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}
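For what it's worth, this is how I have been running Logstash while testing; the config path below is just a placeholder, not my exact file. The --config.test_and_exit flag confirms the config at least parses, and --log.level debug sometimes surfaces the underlying exception behind a registration failure:

# syntax check only (path is a placeholder for your pipeline file)
bin/logstash -f /etc/logstash/conf.d/csv-webhdfs.conf --config.test_and_exit

# full run with verbose logging, to try to expose the root cause
bin/logstash -f /etc/logstash/conf.d/csv-webhdfs.conf --log.level debug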

CONFIGURATION FILE:
input {
  file {
    id => "csv-hdfs"
    path => "/home/eda-cm-install.svc/directory19a/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    ignore_older => 0
  }
}

filter {
  csv {
    columns => ["Node UUID", "IP Address", "Node Name", "MIB Expression", "Time Stamp (ms)",
                "Poll Interval (ms)", "MIB Instance", "Metric Value", "Display Attribute", "Filter Value"]
    separator => ","
    remove_field => ["message"]
  }
}

output {
  stdout { codec => rubydebug }
  webhdfs {
    host => "hostname-eda-local"
    port => 8888
    #port => 50070
    path => "/fdn/test_webhdfs/dt=%{+YYYY-MM-dd}/logstash-%{+HH}.csv" # (required)
    user => "eda-cm-install.svc"
    ssl_cert => "/home/eda-cm-install.svc/conf/jssecacerts"
    use_ssl_auth => true
    id => "cvs_webhdfs_test"
  }
}
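In case someone wants to reproduce this without my CSV files, a stripped-down pipeline along these lines should hit the same registration step (a sketch only; the generator input just emits one test event, and the host, port, and paths mirror my config above):

input {
  generator {
    count   => 1
    message => "test"
  }
}
output {
  webhdfs {
    host => "hostname-eda-local"
    port => 8888
    path => "/fdn/test_webhdfs/logstash-test.csv"
    user => "eda-cm-install.svc"
    ssl_cert => "/home/eda-cm-install.svc/conf/jssecacerts"
    use_ssl_auth => true
  }
}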

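Outside of Logstash, the WebHDFS REST endpoint can be checked directly with curl. This is the sanity check I would run, assuming 50070 is the NameNode's usual WebHDFS HTTP port (I would try both it and the 8888 from my config):

# hypothetical check; host, path, and user.name match the config above
curl -i "http://hostname-eda-local:50070/webhdfs/v1/fdn/test_webhdfs?op=LISTSTATUS&user.name=eda-cm-install.svc"

# over TLS; if client-cert auth is enforced (as use_ssl_auth suggests),
# curl would also need --cert/--key pointing at the same credentials
curl -ik "https://hostname-eda-local:50070/webhdfs/v1/fdn/test_webhdfs?op=LISTSTATUS&user.name=eda-cm-install.svc"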