Hello all.
I have an ELK stack running on a CentOS server, plus logging satellites, also on CentOS, that ship their logs with Filebeat. That part works nicely.
I'm also trying to set up my Raspberry Pis with Filebeat, and while Filebeat is running and appears to be working, nothing shows up in Elasticsearch.
Here is a session:
gauntlet:~# /opt/filebeat/filebeat-linux-arm -c /etc/filebeat/filebeat.yml -e -d output
2017/01/03 13:24:12.305137 beat.go:267: INFO Home path: [/opt/filebeat] Config path: [/opt/filebeat] Data path: [/opt/filebeat/data] Logs path: [/opt/filebeat/logs]
2017/01/03 13:24:12.305446 beat.go:177: INFO Setup Beat: filebeat; Version: 6.0.0-alpha1-git1744740
2017/01/03 13:24:12.305505 logp.go:219: INFO Metrics logging every 30s
2017/01/03 13:24:12.306409 output.go:167: INFO Loading template enabled. Reading template file: /opt/filebeat/filebeat.template.json
2017/01/03 13:24:12.334737 output.go:178: INFO Loading template enabled for Elasticsearch 2.x. Reading template file: /opt/filebeat/filebeat.template-es2x.json
2017/01/03 13:24:12.353874 client.go:120: INFO Elasticsearch url: http://elasticsearch-dev:9200
2017/01/03 13:24:12.354158 outputs.go:106: INFO Activated elasticsearch as output plugin.
2017/01/03 13:24:12.355446 publish.go:291: INFO Publisher name: gauntlet
2017/01/03 13:24:12.356439 async.go:63: INFO Flush Interval set to: 1s
2017/01/03 13:24:12.356553 async.go:64: INFO Max Bulk Size set to: 50
2017/01/03 13:24:12.357595 beat.go:207: INFO filebeat start running.
2017/01/03 13:24:12.357981 registrar.go:85: INFO Registry file set to: /opt/filebeat/data/registry
2017/01/03 13:24:12.358282 registrar.go:106: INFO Loading registrar data from /opt/filebeat/data/registry
2017/01/03 13:24:12.361308 registrar.go:131: INFO States Loaded from registrar: 6
2017/01/03 13:24:12.361561 crawler.go:34: INFO Loading Prospectors: 5
2017/01/03 13:24:12.362314 prospector_log.go:57: INFO Prospector with previous states loaded: 1
2017/01/03 13:24:12.363741 prospector_log.go:57: INFO Prospector with previous states loaded: 1
2017/01/03 13:24:12.364536 registrar.go:230: INFO Starting Registrar
2017/01/03 13:24:12.364536 sync.go:41: INFO Start sending events to output
2017/01/03 13:24:12.366173 prospector_log.go:57: INFO Prospector with previous states loaded: 2
2017/01/03 13:24:12.368378 prospector_log.go:57: INFO Prospector with previous states loaded: 1
2017/01/03 13:24:12.368885 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2017/01/03 13:24:12.370058 prospector_log.go:57: INFO Prospector with previous states loaded: 1
2017/01/03 13:24:12.371198 crawler.go:46: INFO Loading Prospectors completed. Number of prospectors: 5
2017/01/03 13:24:12.371672 crawler.go:61: INFO All prospectors are initialised and running with 6 states to persist
2017/01/03 13:24:12.371706 prospector.go:111: INFO Starting prospector of type: log
2017/01/03 13:24:12.371740 prospector.go:111: INFO Starting prospector of type: log
2017/01/03 13:24:12.371806 prospector.go:111: INFO Starting prospector of type: log
2017/01/03 13:24:12.371804 prospector.go:111: INFO Starting prospector of type: log
2017/01/03 13:24:12.371866 prospector.go:111: INFO Starting prospector of type: log
2017/01/03 13:24:12.379202 log.go:84: INFO Harvester started for file: /var/log/auth.log
2017/01/03 13:24:12.379883 log.go:84: INFO Harvester started for file: /var/log/syslog
2017/01/03 13:24:12.383826 log.go:84: INFO Harvester started for file: /var/log/mail.log
2017/01/03 13:24:17.438123 client.go:652: INFO Connected to Elasticsearch version 5.1.1
2017/01/03 13:24:17.438358 output.go:214: INFO Trying to load template for client: http://elasticsearch-dev:9200
2017/01/03 13:24:17.440716 output.go:235: INFO Template already exists and will not be overwritten.
2017/01/03 13:24:17.612737 single.go:150: DBG send completed
2017/01/03 13:24:17.759963 single.go:150: DBG send completed
2017/01/03 13:24:27.503592 single.go:150: DBG send completed
2017/01/03 13:24:37.521485 single.go:150: DBG send completed
2017/01/03 13:24:42.307568 logp.go:230: INFO Non-zero metrics in the last 30s: libbeat.es.call_count.PublishEvents=4 libbeat.es.publish.read_bytes=2107 registar.states.current=6 filebeat.harvester.running=3 libbeat.es.publish.write_bytes=39767 libbeat.publisher.published_events=85 libbeat.es.published_and_acked_events=85 publish.events=94 filebeat.harvester.open_files=3 registrar.writes=3 registrar.states.update=94 filebeat.harvester.started=3
2017/01/03 13:24:42.498645 single.go:150: DBG send completed
2017/01/03 13:24:47.458720 single.go:150: DBG send completed
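The metrics line above reports libbeat.es.published_and_acked_events=85, i.e. Elasticsearch did acknowledge the events. A quick way to see which index they actually landed in is to query the cluster directly (a sketch; the hostname elasticsearch-dev is taken from the log output above, adjust as needed):

```shell
# List all indices on the dev cluster, with document counts,
# to see where the acked events ended up:
curl -s 'http://elasticsearch-dev:9200/_cat/indices?v'

# Count documents in any index whose name starts with "filebeat":
curl -s 'http://elasticsearch-dev:9200/filebeat*/_count?pretty'
```

If the documents are there but under an index name that the Kibana index pattern doesn't match, they won't show up in Discover even though indexing succeeded.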
Here is the YAML:
gauntlet:~# cat /etc/filebeat/filebeat.yml
filebeat:
  prospectors:
    -
      paths:
        - /var/log/auth.log
      input_type: log
      document_type: auth
      scan_frequency: 1s
    -
      paths:
        - /var/log/apache2/access.log
      input_type: log
      document_type: apache_access
      scan_frequency: 1s
    -
      paths:
        - /var/log/apache2/error.log
      input_type: log
      document_type: apache_error
      scan_frequency: 1s
    -
      paths:
        - /var/log/mail.log
      input_type: log
      document_type: mail
      scan_frequency: 1s
    -
      paths:
        - /var/log/syslog
      input_type: log
      document_type: syslog
      scan_frequency: 5s
output:
  elasticsearch:
    hosts: ["elasticsearch-dev:9200"]
    index: "filebeat"
#  logstash:
#    # The Logstash hosts
#    hosts: ["elasticsearch-dev:5044"]
#    index: "filebeat"
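One thing I'm unsure about is the index setting. My understanding (an assumption on my part, based on the 5.x/6.x docs) is that in these versions output.elasticsearch.index is used verbatim unless it contains a date pattern, so the output section would behave roughly like this:

```yaml
output:
  elasticsearch:
    hosts: ["elasticsearch-dev:9200"]
    # Default would be "filebeat-%{+yyyy.MM.dd}", producing daily indices
    # like filebeat-2017.01.03. A plain "filebeat" here writes to an index
    # literally named "filebeat", which a Kibana index pattern of
    # "filebeat-*" would NOT match.
    index: "filebeat"
```

Could that explain why the CentOS satellites (running an older Filebeat) show up while the Raspberry Pi build (6.0.0-alpha1) doesn't?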
What might be wrong in my setup on the Raspberry Pis? Like I said, the CentOS machines work fine.