How to install Filebeat?


(Kartheek Gummaluri) #1

We have two gcloud instances and we need to collect logs from both of them using Filebeat. We have another instance that stores the logs and analyzes them with Kibana, Elasticsearch, and Logstash. My question: can you tell me what to install on each of these instances to work with Filebeat?

Thanks in advance,
Kartheek Gummaluri


(Magnus Bäck) #2

Have you seen the documentation?

https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-getting-started.html
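The getting-started guide boils down to a shipper-side configuration along these lines (a minimal sketch for the Filebeat 1.x config layout; the paths and the Logstash host are placeholders you would replace with your own):

```yaml
filebeat:
  prospectors:
    -
      # Log files to ship; adjust to whatever your instances produce.
      paths:
        - /var/log/*.log
      input_type: log
output:
  logstash:
    # The instance running Logstash/Elasticsearch/Kibana.
    hosts: ["logstash-host:5044"]
```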


(Kartheek Gummaluri) #3

Yes, but there are two plugins, right? I was confused about where to install the filebeat input plugin and the filebeat plugin.


(Magnus Bäck) #4

Okay, so you're really asking about Logstash. What version of Logstash are you using? Recent versions have the beats input plugin out of the box so you don't need to install anything extra.
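For context, the receiving side is just a beats input listening on a port. A minimal sketch (the port number is the conventional default, not anything from this thread):

```conf
input {
  beats {
    # Filebeat connects to this port; 5044 is the usual choice.
    port => 5044
  }
}
```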


(Kartheek Gummaluri) #5

logstash-1.5.6


(Kartheek Gummaluri) #6

The reason I am using Logstash is that we need to query the message field, and the message field is analyzed.
Also, can you tell me whether the configuration below is correct, so that I will be able to query the message field?

input {
  file {
    path => "/var/tmp/querylogs.log"
    start_position => beginning
  }
}
filter {
  grok {
    match => { "message"}
  }
}
output {
  elasticsearch {
    protocol => "http"
  }
  stdout {}
}
output {
  elasticsearch {
    index => "querylogs"
    index_type => "logs"
  }
  stdout {}
}


(Magnus Bäck) #7

Follow the plugin installation documentation to install logstash-input-beats. If that doesn't work it might be because the beats input requires Logstash 2.x, but I don't think it does. Unless you have a particular reason to stay on Logstash 1.5.x, I'd recommend that you upgrade.
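For Logstash 1.5.x, the plugin installation documentation comes down to a single command run from the Logstash installation directory (later major versions renamed the script, so check the docs for your version):

```shell
# Logstash 1.5.x plugin manager; installs the beats input plugin.
bin/plugin install logstash-input-beats
```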

Looking at your Logstash configuration, your grok filter doesn't do anything. If you want to parse a message you have to tell Logstash how that should be done. See the examples in the documentation. Also, you have two elasticsearch outputs and two stdout outputs. That hardly makes sense.
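To make that concrete, a cleaned-up version of the configuration might look like the sketch below. The grok pattern is a placeholder assumption, since the actual log format hasn't been shown; the two output blocks are merged into one:

```conf
input {
  file {
    path => "/var/tmp/querylogs.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    # Placeholder pattern; replace with one matching your actual log lines.
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:msg}" }
  }
}
output {
  # A single elasticsearch output with the index settings merged in.
  elasticsearch {
    protocol => "http"
    index => "querylogs"
  }
  stdout {}
}
```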


(Kartheek Gummaluri) #8

I mentioned in previous posts that we have two instances, so I need Filebeat on both instances, right? Also, if we output to Elasticsearch, how do I make the message field unanalyzed so that searching can be done?


(Magnus Bäck) #9

I mentioned in previous posts that we have two instances, so I need Filebeat on both instances, right?

Yes.

How do I make the message field unanalyzed so that searching can be done?

With the default mappings configured by the index template that ships with Logstash, all string fields have a .raw subfield containing an unanalyzed copy of the field. You can confirm that you can perform the searches you want using those fields. To skip the .raw subfield and make the field itself unanalyzed, make a copy of the index template, modify it, and configure Logstash to use it. This was discussed in a thread in the Logstash group just a few days ago.
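For example, an exact match against such a subfield can be expressed as a term query (a hypothetical example; the index name and value are placeholders):

```json
{
  "query": {
    "term": {
      "message.raw": "the exact, unanalyzed line you indexed"
    }
  }
}
```

Because `message.raw` is not analyzed, the term query only matches documents whose message is byte-for-byte identical to the given string.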
