How should I set up Logstash to collect from multiple machines?

I am new to ES and Logstash, and I am trying to set up Logstash to collect logs from specific directories and forward them to my Elasticsearch instance. My ES node runs on my local machine, and the logs are located in different directories on different VMs.

My question is, what is the ideal setup configuration in this case? I was looking at this page:

https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html#configuring-geoip-plugin

which discusses how to set up a config file, and I am not sure whether I should run a single Logstash instance on my local machine and set up Filebeat on the VMs, or whether I should run multiple Logstash instances (one for each VM) and forward those to my ES node. Is there even a need for Filebeat?

Either method works. I prefer doing as little as possible on the leaf nodes and having them send all events to a central Logstash server (or set of servers) that does all the filtering.

If the VMs are small, the much bigger overhead of Logstash makes Filebeat a better option.
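In concrete terms, that means each VM runs Filebeat with nothing but a prospector and an output.logstash section pointing at the central machine, and the central Logstash has a single beats input in front of the filters. This is only a sketch; the paths, hostnames, and port (5044) are placeholders:

# On each VM: Filebeat only ships, no filtering (Filebeat 1.x syntax)
filebeat:
  prospectors:
    - paths:
        - /var/log/myapp/*.log                    # placeholder path
      input_type: log
output:
  logstash:
    hosts: ["central-logstash.example:5044"]      # placeholder address of the central Logstash

# On the central machine: Logstash receives, filters, and indexes
input {
  beats { port => 5044 }
}
filter {
  # grok / date / etc. live here, on the central box only
}
output {
  elasticsearch { hosts => "localhost:9200" }     # the local ES node
}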

That makes sense. I'll run a local instance of Logstash and use Filebeat on the VMs. Thanks for the advice.

Hey dal, I am trying to do the same configuration.

So I set up Filebeat on each of my VMs as specified:

filebeat:
  prospectors:
    -
      paths:
        - \mypath*.log
      document_type: mytype
      input_type: log
      ignore_older: 0
      close_older: 0
      fields:
        product: order_online
      fields_under_root: true
      tail_files: false

output:
  logstash:
    hosts: ["My.Local.Machine.ip.adress:5044"]

shipper:

logging:
  files:
    rotateeverybytes: 10485760

And then in my local machine I have the following configuration:

input {
  beats {
    port => 5044
    ssl => false
  }
}

filter {...}
output {
  if "_grokparsefailure" in [tags] {
    file {
      path => "mypath2/error.log"
    }
  } else {
    elasticsearch {
      hosts => "My.Local.Machine.ip.adress:9200"
      index => "myindexname"
      manage_template => false
      document_type => "mydocname"
    }
    # stdout is just another output in the same branch, not a nested output section
    stdout { codec => json }
  }
}
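It can also save time to let Logstash validate the pipeline file before restarting it; on Logstash 2.x the check looks roughly like this, where the file name is just a placeholder:

bin/logstash --configtest -f central-pipeline.conf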

With this configuration it's collecting from my local machine but not from the VMs. I think I am missing something; can you please help me?

Thanks

- \mypath*.log

Are you missing a drive letter here? I also believe you need to quote the whole value, use two backslashes, or use forward slashes.
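For example, any one of these spellings keeps the backslashes intact in YAML; the C: drive letter is only a guess at what might be missing, and of course only one of the three lines would actually be used:

paths:
  - 'C:\mypath\*.log'       # single quotes: backslashes are taken literally
  - "C:\\mypath\\*.log"     # double quotes: backslashes must be doubled
  - C:/mypath/*.log         # forward slashes also work on Windows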

The path is specified correctly, as it works fine if I just run Logstash and ES on the VM itself.
I am sure it's due to the fact that I am trying to connect to a Logstash/ES that is not on the same machine.

Well, if Filebeat has problems connecting to Logstash there should be an error message in the Filebeat log.
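If the log file doesn't show anything useful, running Filebeat in the foreground with debug logging usually surfaces connection errors quickly. This is a sketch using the standard libbeat flags; the config path is the default deb location and the address is the placeholder from the posts above:

# Run Filebeat in the foreground with debug output enabled
filebeat -e -d "*" -c /etc/filebeat/filebeat.yml

# From the VM, confirm the Logstash beats port is reachable at all
nc -vz My.Local.Machine.ip.adress 5044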

Hi, can someone help me with how to install Logstash?

I tried

echo "deb http://packages.elastic.co/elasticsearch/2.x/debian stable main" | sudo tee -a /etc/apt/sources.list.d/elasticsearch-2.x.list

echo "deb http://packages.elastic.co/kibana/4.4/debian stable main" | sudo tee -a /etc/apt/sources.list.d/kibana-4.4.x.list

and executed apt-get install logstash

But it failed, saying it was unable to locate the package.

Can someone please help me fix this issue?
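For what it's worth, the two repository lines above only add Elasticsearch and Kibana, so apt cannot find a logstash package until a Logstash repository is added as well and the package lists are refreshed. A sketch for the 2.x line, where the version number is only an example:

echo "deb http://packages.elastic.co/logstash/2.2/debian stable main" | sudo tee -a /etc/apt/sources.list.d/logstash-2.2.x.list
sudo apt-get update && sudo apt-get install logstash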

@krddy1026, please start a new thread for your completely unrelated question.