Pulling logs from a remote machine

How can I configure Logstash to read a particular log file from a remote server?

In other words, how can I pull logs from remote machines?

Please provide me with a couple of options.

You can use Logstash or Filebeat to read log files off the servers and send them to Elasticsearch.
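
As a minimal sketch, a Filebeat setup running on the remote machine might look something like this (the log path and Elasticsearch host below are placeholders for your environment):

```yaml
# filebeat.yml -- runs on the remote machine that produces the logs
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log              # assumption: wherever your log files live

output.elasticsearch:
  hosts: ["http://elasticsearch-host:9200"]   # assumption: your Elasticsearch address
```

Filebeat tails the files locally and ships each line over the network, so nothing has to "pull" the logs from the Elasticsearch side.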

Thanks!

To be more specific, I have a Hadoop HDFS system that is summarizing the logs, so I need to get those logs into Elasticsearch and display them on a Kibana dashboard.

What is the best approach to that?

How are the logs landing on HDFS? Could you run Filebeat or Logstash on the log source? Other than that, you would probably need to mount the HDFS filesystem using something like FUSE and read the files off that way. I have no idea if that would actually work, though. I feel like writing things into HDFS only to pull them back out again would be inefficient.
https://wiki.apache.org/hadoop/MountableHDFS
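
If you did go the FUSE route, a plain Logstash file input pointed at the mount would, in theory, do the rest. Roughly like this (the mount point, index name, and hosts are assumptions for illustration):

```
input {
  file {
    path => "/mnt/hdfs/logs/*.log"          # assumption: wherever you mount HDFS via FUSE
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch-host:9200"]   # assumption: your Elasticsearch address
    index => "hdfs-logs-%{+YYYY.MM.dd}"
  }
}
```

Again, no promises that the file input behaves well against a FUSE-mounted HDFS path; running the shipper on the original log source is the safer bet.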

Logstash has a Hadoop output plugin, but I'm not aware of any Hadoop input plugin.

We also have a Hadoop connector/client (ES-Hadoop) you could use to import them into ES, though in that case you would probably skip going through Logstash entirely.

Is the Hadoop client/connector capable of reading/pulling logs from HDFS?

Yep - https://www.elastic.co/guide/en/elasticsearch/hadoop/current/index.html
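
As a rough sketch of what that could look like with the Spark integration in ES-Hadoop (the HDFS path, node address, and index name are assumptions for illustration, not a definitive setup):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._   // adds saveToEs() to RDDs (from the elasticsearch-hadoop artifact)

object HdfsLogsToEs {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("hdfs-logs-to-es")
      .set("es.nodes", "elasticsearch-host:9200")   // assumption: your Elasticsearch address

    val sc = new SparkContext(conf)

    // Read the summarized log files straight from HDFS...
    val docs = sc.textFile("hdfs:///data/summarized-logs/*")   // assumption: your HDFS output path
      .map(line => Map("message" -> line))

    // ...and index each line as a document. Adjust the index (and type, on older ES versions) as needed.
    docs.saveToEs("hdfs-logs/doc")

    sc.stop()
  }
}
```

Once the data is indexed, building the Kibana dashboard on top is the usual index-pattern-plus-visualizations workflow.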