"Failed to flush outgoing items" Error with logstash-output-webhdfs plugin

Dear all,

When I tried to use the logstash-output-webhdfs plugin to archive logs into HDFS, I occasionally (not always) hit the error below. Does anyone know what is happening and how to fix it?

Failed to flush outgoing items {:outgoing_count=>168, :exception=>"WebHDFS::IOError", :backtrace=>[
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:401:in `request'",
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:270:in `operate_requests'",
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:73:in `create'",
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-webhdfs-3.0.2/lib/logstash/outputs/webhdfs.rb:210:in `write_data'",
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-webhdfs-3.0.2/lib/logstash/outputs/webhdfs.rb:205:in `write_data'",
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-webhdfs-3.0.2/lib/logstash/outputs/webhdfs.rb:195:in `flush'",
"org/jruby/RubyHash.java:1342:in `each'",
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-webhdfs-3.0.2/lib/logstash/outputs/webhdfs.rb:183:in `flush'",
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:219:in `buffer_flush'",
"org/jruby/RubyHash.java:1342:in `each'",
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:216:in `buffer_flush'",
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:193:in `buffer_flush'",
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:112:in `buffer_initialize'",
"org/jruby/RubyKernel.java:1479:in `loop'",
"/home/cloudera/elasticsearch/logstash/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:110:in `buffer_initialize'"], :level=>:warn}

My configuration is as follows:
webhdfs {
  codec => json
  host => "master"
  port => 50070
  path => "/user/logstash/access_error/dt=%{log_date}/logstash-%{+HH}.log"
  user => "hdfs"
}
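
In case it is relevant: the output above relies on the plugin's default buffering behaviour. The snippet below is only a sketch of how the flush behaviour could be made explicit; the option names flush_size and idle_flush_time are my assumption based on the logstash-output-webhdfs 3.x documentation, and the values are illustrative, not recommendations.

webhdfs {
  codec => json
  host => "master"
  port => 50070
  path => "/user/logstash/access_error/dt=%{log_date}/logstash-%{+HH}.log"
  user => "hdfs"
  # Assumed option names from the 3.x docs; values are only examples.
  flush_size => 500        # flush once this many events are buffered
  idle_flush_time => 1     # or after this many seconds of inactivity
}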

Hello @jackhe,

Is there any mention of retries in your log before the error you shared with us? From the source code, the plugin should retry these kinds of errors.

Hi @pierhugues,

Thanks a lot for your response.

No, the error I shared is the whole thing, and when it happens, it keeps coming with every event.

I didn't modify retry_times in my conf file.

Now I am trying "retry_times => 100"; so far, so good.
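For anyone who finds this later, here is a sketch of how the retry-related settings could be added to the output block. It assumes the logstash-output-webhdfs 3.x option names retry_known_errors, retry_times, and retry_interval, so please double-check them against your plugin version; the values are only examples.

webhdfs {
  codec => json
  host => "master"
  port => 50070
  path => "/user/logstash/access_error/dt=%{log_date}/logstash-%{+HH}.log"
  user => "hdfs"
  # Assumed retry options from the 3.x docs; values are only examples.
  retry_known_errors => true   # retry on known WebHDFS/HDFS errors
  retry_times => 100           # number of attempts before giving up
  retry_interval => 0.5        # seconds to wait between attempts
}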