Logstash output to webhdfs

I need to split the Logstash output to Hadoop into one file per minute.

My output configuration looks like this:

 webhdfs {
   host  => "127.0.0.1"                                               # (required)
   port  => 50070                                                     # (optional, default: 50070)
   path  => "/user/logstash/dt=%{+YYYY-MM-dd}/logstash-%{+HH:mm}.log" # (required)
   user  => "vphadoop"                                                # (required)
   codec => "json"
 }

I am getting the following error:

[2017-06-08T13:22:48,149][ERROR][logstash.outputs.webhdfs ] Max write retries reached. Events will be discarded. Exception: {"RemoteException":{"exception":"InvalidPathException","javaClassName":"org.apache.hadoop.fs.InvalidPathException","message":"Invalid path name Invalid file name: /user/logstash/dt=2017-03-01/logstash-13:04.log"}}
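My guess is that the colon in the file name is the culprit: as far as I know, HDFS rejects `:` in path components, which would explain the InvalidPathException. A variant of the path pattern I could try, swapping the colon for a dot while keeping the same per-minute split (a sketch, not yet tested):

 webhdfs {
   host  => "127.0.0.1"
   port  => 50070
   # Same per-minute partitioning, but "HH.mm" instead of "HH:mm" so the
   # resulting file name contains no colon and is valid on HDFS.
   path  => "/user/logstash/dt=%{+YYYY-MM-dd}/logstash-%{+HH.mm}.log"
   user  => "vphadoop"
   codec => "json"
 }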

Looking at HDFS on the Hadoop side, I see:

root@vpelastic:[/var/log]$ hdfs dfs -ls /user
Found 2 items
drwxr-xr-x - vpwrk1 supergroup 0 2017-06-05 11:16 /user/db
drwxrwxrwx - vphadoop supergroup 0 2017-06-01 15:05 /user/logstash

I expected Logstash to create the rest of the directories itself.
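In case the plugin cannot create the intermediate directories, a quick sanity check would be to pre-create a partition directory by hand as the vphadoop user and re-test (the date below is just an example):

 # Pre-create one dt= partition and verify it shows up; if writes then
 # succeed, the problem is directory creation rather than the path name.
 sudo -u vphadoop hdfs dfs -mkdir -p /user/logstash/dt=2017-03-01
 hdfs dfs -ls /user/logstash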

Thanks
Sharon.
