I am new to ES and Logstash, and I am trying to set up Logstash to collect logs from specific directories and forward them to my Elasticsearch instance. My ES node runs on my local machine, and the logs are located in different directories on different VMs.
My question is, what is the ideal setup configuration in this case? I was looking at this page:
which discusses how to set up a config file, and I am not sure whether I should run a single Logstash instance on my local machine and set up Filebeat on the VMs, or run multiple Logstash instances (one per VM) and forward their output to my ES node. Is there even a need for Filebeat?
Either method works. I prefer doing as little as possible on the leaf nodes and having them send all events to a central Logstash server (or set of servers) that does all the filtering.
If the VMs are small, Logstash's much bigger overhead makes Filebeat the better option on the leaf nodes.
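To illustrate the central-Logstash approach, here is a minimal sketch of the two configs involved. The log path, hostname, and ports (5044 for the Beats input, 9200 for ES) are assumptions; adjust them to your environment.

On each VM, Filebeat ships the files to the central Logstash server:

```yaml
# filebeat.yml (on each VM)
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log   # hypothetical path; point this at your log directories
output.logstash:
  hosts: ["logstash-host:5044"]  # hostname of the central Logstash machine
```

On the central machine, Logstash accepts Beats connections, applies any filtering, and writes to Elasticsearch:

```
# logstash.conf (on the central Logstash machine)
input {
  beats {
    port => 5044
  }
}
# filter { ... }   # grok/mutate filters would go here
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```

This keeps all parsing logic in one place, so changing a filter means touching one server rather than every VM.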
The path is correctly specified, as it works fine if I run Logstash and ES inside the VM. I am sure the problem is that I am trying to connect to a Logstash/ES instance that is not on the same machine.