Configure Filebeat for All Logs


(Naseerudeen MEEAJANE) #1

Hello,

I have installed ELK on my server. Now I am trying to configure Filebeat to send some information to my ELK server, but when I configure Filebeat following a tutorial, it does not work.

Does anyone have any advice or a tutorial that could help me?

Thanks a lot


(Magnus Bäck) #2

Please post your full configuration and make sure you format the configuration as code (there's a toolbar button for it).

Please also post the logs you get when you start Filebeat with -v -d "*".


(Steffen Siering) #3

Which tutorial have you used?

Filebeat has an official getting started guide.


(Naseerudeen MEEAJANE) #4

Hello, I followed a tutorial to set up my ELK server, but for Filebeat I used the official guide.


(Naseerudeen MEEAJANE) #5

Hello,

Here is the result when I am trying to start Filebeat :

root@crinforecettevm01:~# /etc/init.d/filebeat start -v -d "*"
2016/01/19 15:18:19.009625 transport.go:125: ERR SSL client failed to connect with: dial tcp 37.59.235.27:5044: getsockopt: connection refused
root@crinforecettevm01:~#

I think I have something wrong with my SSL setup.
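For what it's worth, "connection refused" is a plain TCP-level error: nothing was accepting connections on port 5044, so the SSL handshake never even started. A minimal check, assuming bash is available (HOST is a placeholder; set it to your Logstash server's address, 37.59.235.27 in the log above):

```shell
#!/bin/bash
# Sketch: check whether anything is listening before blaming SSL.
# HOST/PORT are placeholders; 37.59.235.27:5044 is the pair from the
# error message above.
HOST="${HOST:-127.0.0.1}"
PORT="${PORT:-5044}"

# bash's /dev/tcp pseudo-device attempts a TCP connect.
if (exec 3<>"/dev/tcp/$HOST/$PORT") 2>/dev/null; then
    echo "listener found on $HOST:$PORT"
else
    echo "no listener on $HOST:$PORT - is Logstash running and bound there?"
fi
```

Only once a listener is found is it worth testing the certificate itself, e.g. with `openssl s_client -connect HOST:5044 -CAfile /etc/pki/tls/certs/logstash-forwarder.crt`.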


(ruflin) #6

Did you verify that your Logstash server is running and working as expected? Can you share your config file?


(Naseerudeen MEEAJANE) #7

Hello,

Yes, for sure my Logstash server is running.
Here is the config for Filebeat:

input {
  beats {
    port => 5044
    type => "logs"
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

Here is the config for elasticsearch :

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

And here is the filter config file:

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
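As an aside, a grok filter like this can be tried in isolation before involving Filebeat at all: a throwaway pipeline that reads syslog lines from stdin and prints what grok extracts. This is only a sketch; note the backslash-escaped `\[` and `\]` around the pid, which are easy to lose when pasting configs into a forum:

```conf
# test-grok.conf -- run with: bin/logstash -f test-grok.conf
# then paste a syslog line on stdin and inspect the parsed fields.
input { stdin { type => "syslog" } }

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
  }
}

output { stdout { codec => rubydebug } }
```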

When trying to restart my Logstash server, I get this message:

{:timestamp=>"2016-01-21T15:54:26.985000+0100", :message=>"The error reported is: \n Address already in use - bind - Adresse déjà utilisée"}


(ruflin) #8

Can you also share the Filebeat config? The above is the Logstash config.

For the error above: it seems like another Logstash instance is running in the background, which you should stop first. That instance probably has a different config.
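A quick way to check for a stray instance, sketched below (commands vary by distro, and `lsof` may need installing):

```shell
#!/bin/sh
# List any running Logstash processes; the [l] trick keeps grep from
# matching its own command line. Prints a fallback message if none found.
ps aux | grep '[l]ogstash' || echo "no logstash process found"

# If something unexpected is listed, note its PID and stop it, e.g.:
#   sudo kill <PID>
# Or see directly what owns the Beats port:
#   sudo lsof -i :5044
```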


(Naseerudeen MEEAJANE) #10

Hi,

There is just one active line in my elasticsearch.yml:

network.host: localhost

All other lines are commented out. About Logstash, I am sure that no other instance is running; I don't see anything in my process list.

Thank you for your help


(Magnus Bäck) #11

{:timestamp=>"2016-01-21T15:54:26.985000+0100", :message=>"The error reported is: \n Address already in use - bind - Adresse déjà utilisée"}

Logstash reads all files in the configuration file directory (probably /etc/logstash/conf.d). Make sure you don't have any backup files or similar lying around.
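A sketch of how to spot such leftovers, assuming the Debian-style /etc/logstash/conf.d layout:

```shell
#!/bin/sh
# Logstash concatenates EVERY file in its config directory, so a .bak
# copy of an input file will try to bind port 5044 a second time.
CONF_DIR="${CONF_DIR:-/etc/logstash/conf.d}"

# Everything Logstash will load:
find "$CONF_DIR" -maxdepth 1 -type f 2>/dev/null

# Files not ending in .conf are the usual suspects:
find "$CONF_DIR" -maxdepth 1 -type f ! -name '*.conf' 2>/dev/null || true
```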


(Naseerudeen MEEAJANE) #12

Hello,
I had some backup files in the Logstash config folder. I moved them, and now Logstash is UP and RUNNING :slight_smile:
When starting Filebeat, there are no errors and some information appears in Kibana. So, good news.

So now, I am going to configure Filebeat. I want to collect some logs from personal folders and send them to Logstash so I can see them in Kibana.

Do you know how I can do this? Also, do you have any information about filters? Right now, I have a lot of information in Kibana.
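For reference, shipping extra folders is done with additional prospector entries in filebeat.yml. A sketch in Filebeat 1.x syntax; the path and the `mylogs` type are made-up placeholders, while the IP and certificate are the ones from earlier in this thread:

```yaml
filebeat:
  prospectors:
    -
      paths:
        - /home/myuser/logs/*.log   # hypothetical personal folder
      input_type: log
      document_type: mylogs         # becomes [type] for Logstash filters

output:
  logstash:
    hosts: ["37.59.235.27:5044"]
    tls:
      certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]
```

The `document_type` value can then be matched in a Logstash filter with `if [type] == "mylogs" { ... }` to keep these events separate from syslog.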

Thanks a lot


(Magnus Bäck) #13

Do you know how I can do this? Also, do you have any information about filters? Right now, I have a lot of information in Kibana.

If you ask specific questions, we can help out. For best results, please post Logstash questions in the Logstash category.


(Naseerudeen MEEAJANE) #14

Ok thanks.

About Filebeat: I started it this morning and saw information in Kibana. But now, no data is returned.
Is that normal?

I don't understand how Filebeat works.

Thanks


(Magnus Bäck) #15

You mean no new data is seen in Kibana? Or no data at all? If the latter, it sounds like an Elasticsearch problem.

Hint: Your questions are still vague and hard to answer.


(Naseerudeen MEEAJANE) #16

Hello,
I mean no new data. Some data was sent when I started Filebeat, but for now, nothing new.
I don't know if that is normal behaviour or if I have an issue with Elasticsearch.


(Magnus Bäck) #17

Is Filebeat still running? Have the monitored files actually changed? Are there any clues in its log file? You may need to increase the logging by starting it with -v -d "*".


(Naseerudeen MEEAJANE) #18

Yes, Filebeat is still running. The monitored files? You mean the log files? If yes, they have definitely changed.
About "clues in the log file", what do you mean by that? My English is not perfect and I am not sure I understand it very well.
I restarted Filebeat with the options -v -d "*".

In the Filebeat config, I don't see any line for configuring Filebeat's log. Do you know where I can find Filebeat's logs?


(Magnus Bäck) #19

See https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-configuration-details.html#configuration-logging for information about how to configure Filebeat's logging. Once you have some logs, you can post them here to get help.
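For convenience, the logging section described on that page looks roughly like this in Filebeat 1.x (path and level here are examples, not requirements):

```yaml
logging:
  level: debug       # error, warning, info, or debug
  to_syslog: false
  to_files: true
  files:
    path: /var/log/mybeat
    name: mybeat.log
    rotateeverybytes: 10485760   # rotate after 10 MB
```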


(Naseerudeen MEEAJANE) #20

Thanks for the link.
I will set this up this afternoon and then post the log file.


(Naseerudeen MEEAJANE) #21

Hello, here is the stdout when starting Filebeat with the right options:

root@crinforecettevm01:/var/log/mybeat# /etc/init.d/filebeat start -v -d "*"
2016/01/25 10:43:20.043185 beat.go:97: DBG  Initializing output plugins
2016/01/25 10:43:20.043237 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/01/25 10:43:20.058109 client.go:244: DBG  ES Ping(url=http://localhost:9200, timeout=1m30s)

2016/01/25 10:44:23.328609 client.go:249: DBG  Ping request failed with: Head http://localhost:9200: dial tcp [::1]:9200: getsockopt: connection refused
2016/01/25 10:44:23.328652 outputs.go:111: INFO Activated elasticsearch as output plugin.
2016/01/25 10:44:23.607016 outputs.go:111: INFO Activated logstash as output plugin.
2016/01/25 10:44:23.607052 publish.go:198: DBG  create output worker: 0x0, 0xc8200cd0e0
2016/01/25 10:44:23.607092 publish.go:198: DBG  create output worker: 0x0, 0x0
2016/01/25 10:44:23.619181 publish.go:235: DBG  No output is defined to store the topology. The server fields might not be filled.
2016/01/25 10:44:23.619240 publish.go:249: INFO Publisher name: crinforecettevm01
2016/01/25 10:44:23.619531 async.go:95: DBG  create bulk processing worker (interval=1s, bulk size=50)
2016/01/25 10:44:23.619594 async.go:95: DBG  create bulk processing worker (interval=1s, bulk size=200)
2016/01/25 10:44:23.619645 beat.go:107: INFO Init Beat: filebeat; Version: 1.0.1
root@crinforecettevm01:/var/log/mybeat#

About the log file: the level is "Debug" and there are a lot of lines, but here are the lines that don't look right to me:

2016-01-25T11:48:40+01:00 DBG Ping request failed with: Head http://localhost:9200: dial tcp [::1]:9200: getsockopt: connection refused
2016-01-25T11:48:40+01:00 INFO Connecting error publishing events (retrying): Head http://localhost:9200: dial tcp [::1]:9200: getsockopt: connection refused
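Those lines point at the cause: the output section of filebeat.yml has both elasticsearch and logstash enabled (the startup log above shows both plugins being activated), and the elasticsearch output keeps retrying localhost:9200 on the client, where no Elasticsearch is running. A sketch of the fix, assuming events should flow only through Logstash:

```yaml
# filebeat.yml -- keep only the logstash output; remove or comment out
# the elasticsearch block so Filebeat stops dialing localhost:9200.
output:
  # elasticsearch:
  #   hosts: ["localhost:9200"]
  logstash:
    hosts: ["37.59.235.27:5044"]
    tls:
      certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]
```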