Getting "Failed to flush outgoing items {exception=>WebHDFS::ServerError}" while using output plugin 'logstash.outputs.webhdfs'

Hi,

I am using the 'logstash.outputs.webhdfs' Logstash output plugin. Below is my Logstash configuration:

  input {
    file {
      type => "file1"
      path => "/var/log/logstash/logstash-1/*"
      start_position => "beginning"
      sincedb_path => "/var/log/logstash/logFile1.log.pos"
    }
  }

  filter {
    # some parsing/filters on input files
  }

  output {
    webhdfs {
      host => "192.168.4.250"
      port => 50070
      path => "/tmp/app.log"
      user => "hdfs"
    }
  }
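
For reference, here is the same output block again with the plugin's documented retry and timeout options written out explicitly. As far as I know these values are the defaults (I have not tuned them), so this is only a sketch of what the plugin should already be doing:

  output {
    webhdfs {
      host => "192.168.4.250"
      port => 50070
      path => "/tmp/app.log"
      user => "hdfs"
      retry_known_errors => true  # retry known transient server errors (default)
      retry_times => 5            # attempts per flush before giving up (default)
      retry_interval => 0.5       # seconds between attempts (default)
      open_timeout => 30          # connect timeout in seconds (default)
      read_timeout => 30          # read timeout in seconds (default)
    }
  }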

But it continuously logs the following error:

[2020-02-27T07:37:23,254][WARN ][logstash.outputs.webhdfs ] Failed to flush outgoing items {:outgoing_count=>1, :exception=>"WebHDFS::ServerError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:351:in `request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:270:in `operate_requests'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/webhdfs-0.8.0/lib/webhdfs/client_v1.rb:73:in `create'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-webhdfs-3.0.6/lib/logstash/outputs/webhdfs.rb:228:in `write_data'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-webhdfs-3.0.6/lib/logstash/outputs/webhdfs.rb:211:in `block in flush'", "org/jruby/RubyHash.java:1419:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-webhdfs-3.0.6/lib/logstash/outputs/webhdfs.rb:199:in `flush'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/stud-0.0.23/lib/stud/buffer.rb:219:in `block in buffer_flush'", "org/jruby/RubyHash.java:1419:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/stud-0.0.23/lib/stud/buffer.rb:216:in `buffer_flush'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/stud-0.0.23/lib/stud/buffer.rb:159:in `buffer_receive'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-webhdfs-3.0.6/lib/logstash/outputs/webhdfs.rb:182:in `receive'", "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:89:in `block in multi_receive'", "org/jruby/RubyArray.java:1792:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:89:in `multi_receive'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:118:in `multi_receive'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:390:in `block in output_batch'", "org/jruby/RubyHash.java:1419:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:389:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:341:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:304:in `block in start_workers'"]}

Then I also tried the WebHDFS API manually, and it works fine.
Here is the output of the WebHDFS API calls:

$ curl -i -X PUT "http://192.168.4.250:50070/webhdfs/v1/tmp/webhdfs/webhdfs-test.txt?user.name=hdfs&op=CREATE"
HTTP/1.1 307 TEMPORARY_REDIRECT
Content-Type: application/octet-stream
Expires: Thu, 01-Jan-1970 00:00:00 GMT
Set-Cookie: hadoop.auth="u=hdfs&p=hdfs&t=simple&e=1370210936666&s=BLAIjTpNwurdsgvFxNL3Zf4bzpg=";Path=/
Location: http://192.168.4.250:50075/webhdfs/v1/tmp/webhdfs/webhdfs-test.txt?op=CREATE&user.name=hdfs&overwrite=false
Content-Length: 0
Server: Jetty(6.1.26)

$ curl -i -T webhdfs-test.txt "http://192.168.4.250:50075/webhdfs/v1/tmp/webhdfs/webhdfs-test.txt?op=CREATE&user.name=hdfs&overwrite=false"
HTTP/1.1 100 Continue
HTTP/1.1 201 Created
Content-Type: application/octet-stream
Location: webhdfs://0.0.0.0:50070/tmp/webhdfs/webhdfs-test.txt
Content-Length: 0
Server: Jetty(6.1.26)
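
As I understand it, a WebHDFS CREATE is a two-step operation: the namenode on port 50070 answers with a 307 redirect, and the data is then written to the datanode address from the Location header (port 50075 above). Since Logstash is deployed in Kubernetes, I suppose the same two steps would also have to succeed from inside the Logstash pod, not just from where I ran curl. A sketch of that check (logstash-0 is a placeholder pod name, and it assumes curl exists in the container):

$ kubectl exec -it logstash-0 -- curl -i -X PUT "http://192.168.4.250:50070/webhdfs/v1/tmp/webhdfs/pod-test.txt?user.name=hdfs&op=CREATE"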

Software versions used:
Logstash: 6.7.0 (deployed through Kubernetes)
Hadoop: 2.7.3 (deployed as a single-node cluster on an Ubuntu 16.04 machine)

Can anybody please suggest what I am doing wrong?
