Hi everybody,
I installed the ELK stack today and configured everything. To cut a long story short: as far as I can tell, Filebeat is not forwarding logs.
I have 3 servers:
xx.xx.xx.233 - Elasticsearch v5.1.2 -
[elk@elasticsearch ~]$ sudo curl -X GET 'http://xx.xx.xx.233:9200'
{
  "name" : "elastic1",
  "cluster_name" : "elasticluster",
  "cluster_uuid" : "hffEWmKbTzKRerPawQUYoQ",
  "version" : {
    "number" : "5.1.2",
    "build_hash" : "c8c4c16",
    "build_date" : "2017-01-11T20:18:39.146Z",
    "build_snapshot" : false,
    "lucene_version" : "6.3.0"
  },
  "tagline" : "You Know, for Search"
}
xx.xx.xx.232 - Kibana v5.1.2 - the site is working
xx.xx.xx.231 - Logstash v5.1.2 + Filebeat v5.1.2 -
[elk@logstash ~]$ sudo systemctl status logstash
● logstash.service - logstash
Loaded: loaded (/etc/systemd/system/logstash.service; enabled; vendor preset: disabled)
Active: active (running) since Mon 2017-01-30 15:42:08 IST; 1min 23s ago
Filebeat status:
[elk@logstash ~]$ sudo systemctl status filebeat
● filebeat.service - filebeat
Loaded: loaded (/usr/lib/systemd/system/filebeat.service; enabled; vendor preset: disabled)
Active: active (running) since Mon 2017-01-30 15:44:43 IST; 5s ago
filebeat.yml:
[elk@logstash ~]$ cat /etc/filebeat/filebeat.yml
- input_type: log
  paths:
    - /var/log/messages
    - /var/log/secure

output.logstash:
  hosts: ["xx.xx.xx.231:5044"]
  ssl.certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]
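For reference, this is the complete 5.x layout I was aiming for, with my paths and host substituted in. The top-level filebeat.prospectors key may be what my paste above is missing:

```yaml
# Minimal Filebeat 5.x config layout (same paths and Logstash host as above).
# Note: the prospector list must sit under a top-level filebeat.prospectors key.
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/messages
    - /var/log/secure

output.logstash:
  hosts: ["xx.xx.xx.231:5044"]
  ssl.certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]
```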
Logstash filter config:
[elk@logstash conf.d]$ cat 01-beats-filter.conf
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
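In case it helps, the grok pattern itself can be sanity-checked outside Logstash with a rough Python approximation. The SYSLOGTIMESTAMP/SYSLOGHOST/DATA/POSINT sub-patterns below are simplified stand-ins, not the exact grok definitions:

```python
import re

# Rough regex translation of the grok pattern above; each named group is an
# approximation of the corresponding grok sub-pattern, not its exact definition.
SYSLOG_RE = re.compile(
    r"(?P<syslog_timestamp>\w{3} +\d{1,2} \d{2}:\d{2}:\d{2}) "  # ~SYSLOGTIMESTAMP
    r"(?P<syslog_hostname>\S+) "                                # ~SYSLOGHOST
    r"(?P<syslog_program>[\w./-]+?)"                            # ~DATA
    r"(?:\[(?P<syslog_pid>\d+)\])?: "                           # optional [POSINT]
    r"(?P<syslog_message>.*)"                                   # ~GREEDYDATA
)

# A made-up sample line in the same shape as /var/log/secure entries:
line = "Jan 30 15:42:08 logstash sshd[1234]: Accepted password for elk"
m = SYSLOG_RE.match(line)
print(m.groupdict())
```

The pattern does match a typical sshd line, so at least the field layout looks right.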
Input config:
[elk@logstash conf.d]$ cat 01-beats-input.conf
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
Output config:
[elk@logstash conf.d]$ cat 01-beats-output.conf
output {
  elasticsearch {
    hosts => ["xx.xx.xx.233:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
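As I understand it, that index sprintf should produce one index per day, named after the beat. Reproducing the %{[@metadata][beat]}-%{+YYYY.MM.dd} expansion by hand for a fixed date:

```python
from datetime import datetime

# Hand-rolled expansion of the Logstash index sprintf
# "%{[@metadata][beat]}-%{+YYYY.MM.dd}" for one example event.
beat = "filebeat"                   # value of [@metadata][beat] set by Filebeat
event_time = datetime(2017, 1, 30)  # the event's @timestamp
index_name = "%s-%s" % (beat, event_time.strftime("%Y.%m.%d"))
print(index_name)  # filebeat-2017.01.30
```

So if events were arriving, I would expect to see an index like filebeat-2017.01.30 in Elasticsearch.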
And here is the problem, which shows up when I run Filebeat in the foreground with the same config:
> [elk@logstash bin]$ sudo ./filebeat -e -c /etc/filebeat/filebeat.yml
> 2017/01/30 14:26:26.456427 beat.go:267: INFO Home path: [/usr/share/filebeat/bin] Config path: [/usr/share/filebeat/bin] Data path: [/usr/share/filebeat/bin/data] Logs path: [/usr/share/filebeat/bin/logs]
> 2017/01/30 14:26:26.456457 beat.go:177: INFO Setup Beat: filebeat; Version: 5.1.2
> 2017/01/30 14:26:26.456601 logp.go:219: INFO Metrics logging every 30s
> 2017/01/30 14:26:26.457359 logstash.go:90: INFO Max Retries set to: 3
> 2017/01/30 14:26:26.457422 outputs.go:106: INFO Activated logstash as output plugin.
> 2017/01/30 14:26:26.457536 publish.go:291: INFO Publisher name: logstash
> 2017/01/30 14:26:26.457747 async.go:63: INFO Flush Interval set to: 1s
> 2017/01/30 14:26:26.457759 async.go:64: INFO Max Bulk Size set to: 2048
> 2017/01/30 14:26:26.457894 beat.go:207: INFO filebeat start running.
> 2017/01/30 14:26:26.458078 registrar.go:85: INFO Registry file set to: /usr/share/filebeat/bin/data/registry
> 2017/01/30 14:26:26.458106 registrar.go:106: INFO Loading registrar data from /usr/share/filebeat/bin/data/registry
> 2017/01/30 14:26:26.479380 registrar.go:123: INFO States Loaded from registrar: 0
> 2017/01/30 14:26:26.479433 crawler.go:34: INFO Loading Prospectors: 1
> 2017/01/30 14:26:26.479484 registrar.go:236: INFO Starting Registrar
> 2017/01/30 14:26:26.479485 prospector_log.go:57: INFO Prospector with previous states loaded: 0
> 2017/01/30 14:26:26.479554 sync.go:41: INFO Start sending events to output
> 2017/01/30 14:26:26.479595 crawler.go:46: INFO Loading Prospectors completed. Number of prospectors: 1
> 2017/01/30 14:26:26.479609 crawler.go:61: INFO All prospectors are initialised and running with 0 states to persist
> 2017/01/30 14:26:26.479631 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
> 2017/01/30 14:26:26.479660 prospector.go:111: INFO Starting prospector of type: log
> 2017/01/30 14:26:26.480251 log.go:84: INFO Harvester started for file: /var/log/secure
> 2017/01/30 14:26:26.480253 log.go:84: INFO Harvester started for file: /var/log/messages
> 2017/01/30 14:26:56.456824 logp.go:230: INFO Non-zero metrics in the last 30s: libbeat.publisher.published_events=2046 libbeat.logstash.publish.write_bytes=132 filebeat.harvester.running=2 filebeat.harvester.started=2 filebeat.harvester.open_files=2
> 2017/01/30 14:26:56.502738 single.go:140: ERR Connecting error publishing events (retrying): read tcp 172.16.50.231:35986->172.16.50.231:5044: i/o timeout
I have spent hours going through guides and tried different configs, but with no luck. I hope you can help me.