HDFS as Input for Logstash

Hello everyone,

I am new to the ELK stack, and I wonder whether an extra Logstash plugin is needed to use HDFS as an input. Unfortunately, I am not sure whether the available plugins are only designed to take Logstash's output data and save it in HDFS. It would be great if you could share some information on how to configure the input part of Logstash so that it reads from an HDFS directory. Or is it simpler than that, like just putting the path in the input section?

Thank you very much in advance.

Perhaps you could mount the HDFS with FUSE and then just use the file input.

https://wiki.apache.org/hadoop/MountableHDFS

I've never tried mounting an HDFS myself, but I've worked with folks who swear that it works. :sweat_smile:
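
As a rough sketch, the pipeline config could look something like this, assuming the HDFS namespace is FUSE-mounted at `/mnt/hdfs` (the mount point and log path here are just placeholders for illustration):

```
input {
  file {
    # Read log files from the FUSE-mounted HDFS directory
    # (/mnt/hdfs is an assumed mount point -- adjust to your setup)
    path => "/mnt/hdfs/logs/**/*.log"
    start_position => "beginning"
    sincedb_path => "/var/lib/logstash/sincedb_hdfs"
  }
}

output {
  # Print events to stdout for testing before wiring up Elasticsearch
  stdout { codec => rubydebug }
}
```

Note that the file input tracks read positions per file in its sincedb, so it's worth testing how the FUSE layer reports files and inodes before relying on it in production.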

GL

You mean I could probably use an NFS mount to make the directory available on the node where Logstash is currently installed, right?