Filebeat not running

Hi, I recently started working with this product on Debian 8. I found the book for version 6.0.0 and set up the ELK stack following it. The problem is this: I can't get Filebeat to ship syslog logs to Logstash (Filebeat -> Logstash -> Elasticsearch -> Kibana).
Here is what I have:

  1. The yml configuration file (below)
  2. How I run Filebeat
  3. The Filebeat log at startup
  4. A screenshot of Kibana (the data did not show up there)

yml
path.data: /var/lib/logstash
path.config: /etc/logstash/conf.d/*.conf
config.reload.automatic: true
config.reload.interval: 3s
http.host: "192.168.10.185"
path.logs: /var/log/logstash

2020-04-02T15:29:55+03:00 INFO Home path: [/usr/share/filebeat/bin] Config path: [/usr/share/filebeat/bin] Data path: [/usr/share/filebeat/bin/data] Logs path: [/usr/share/filebeat/bin/logs]
2020-04-02T15:29:55+03:00 INFO Beat UUID: db2bfc23-cc99-48b6-98d9-4b4a942aa3d1
2020-04-02T15:29:55+03:00 INFO Metrics logging every 30s
2020-04-02T15:29:55+03:00 INFO Setup Beat: filebeat; Version: 6.0.0
2020-04-02T15:29:55+03:00 INFO Beat name: astra
2020-04-02T15:29:55+03:00 ERR  Not loading modules. Module directory not found: /usr/share/filebeat/bin/module
2020-04-02T15:29:55+03:00 INFO filebeat start running.
2020-04-02T15:29:55+03:00 INFO Registry file set to: /usr/share/filebeat/bin/data/registry
2020-04-02T15:29:55+03:00 INFO Loading registrar data from /usr/share/filebeat/bin/data/registry
2020-04-02T15:29:55+03:00 INFO States Loaded from registrar: 12
2020-04-02T15:29:55+03:00 WARN Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2020-04-02T15:29:55+03:00 INFO Loading Prospectors: 1
2020-04-02T15:29:55+03:00 INFO Starting Registrar
2020-04-02T15:29:56+03:00 INFO Starting prospector of type: log; id: 11204088409762598069 
2020-04-02T15:29:56+03:00 WARN BETA: Dynamic config reload is enabled.
2020-04-02T15:29:56+03:00 INFO Loading and starting Prospectors completed. Enabled prospectors: 1
2020-04-02T15:29:56+03:00 INFO Config reloader started
2020-04-02T15:29:56+03:00 INFO Harvester started for file: /var/log/gufw.log
2020-04-02T15:29:56+03:00 INFO Harvester started for file: /var/log/daemon.log
2020-04-02T15:30:01+03:00 ERR  Not loading modules. Module directory not found: /usr/share/filebeat/bin/module
2020-04-02T15:30:01+03:00 INFO Starting 1 runners ...
2020-04-02T15:30:09+03:00 INFO Harvester started for file: /var/log/auth.log
2020-04-02T15:30:09+03:00 INFO Harvester started for file: /var/log/kern.log
2020-04-02T15:30:25+03:00 INFO Non-zero metrics in the last 30s: beat.memstats.gc_next=8965600 beat.memstats.memory_alloc=6087488 beat.memstats.memory_total=54968536 filebeat.events.active=4117 filebeat.events.added=12325 filebeat.events.done=8208 filebeat.harvester.open_files=4 filebeat.harvester.running=4 filebeat.harvester.started=4 libbeat.config.module.running=1 libbeat.config.module.starts=1 libbeat.config.reloads=5 libbeat.output.events.acked=8192 libbeat.output.events.active=4096 libbeat.output.events.batches=6 libbeat.output.events.total=12288 libbeat.output.read.bytes=24 libbeat.output.type=logstash libbeat.output.write.bytes=305862 libbeat.pipeline.clients=1 libbeat.pipeline.events.active=4117 libbeat.pipeline.events.filtered=16 libbeat.pipeline.events.published=12308 libbeat.pipeline.events.retry=2048 libbeat.pipeline.events.total=12325 libbeat.pipeline.queue.acked=8192 registrar.states.current=12 registrar.states.update=8208 registrar.writes=18
2020-04-02T15:30:55+03:00 INFO Non-zero metrics in the last 30s: beat.memstats.gc_next=11606912 beat.memstats.memory_alloc=6149680 beat.memstats.memory_total=81818200 filebeat.events.added=6144 filebeat.events.done=6144 filebeat.harvester.open_files=4 filebeat.harvester.running=4 libbeat.config.module.running=1 libbeat.config.reloads=6 libbeat.output.events.acked=6144 libbeat.output.events.batches=3 libbeat.output.events.total=6144 libbeat.output.read.bytes=18 libbeat.output.write.bytes=151854 libbeat.pipeline.clients=1 libbeat.pipeline.events.active=4117 libbeat.pipeline.events.published=6144 libbeat.pipeline.events.total=6144 libbeat.pipeline.queue.acked=6144 registrar.states.current=12 registrar.states.update=6144 registrar.writes=3

Hey!

Could you share your complete filebeat configuration?

In the meantime, let me share a draft one for reference:

filebeat.inputs:
- type: log 
  paths:
    - /var/log/syslog

This part defines which log files Filebeat should collect from.
If you start Filebeat in debug mode (./filebeat -e -d "*"), you should see events printed to the console.
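As a side note, Filebeat 6.x also ships self-test subcommands that are handy here. A sketch, assuming a standard package install with the config at /etc/filebeat/filebeat.yml (adjust the path to wherever yours actually lives):

```shell
# Validate the configuration file syntax without starting Filebeat.
filebeat test config -c /etc/filebeat/filebeat.yml

# Try to connect to the configured output (Logstash) and report the result.
filebeat test output -c /etc/filebeat/filebeat.yml

# Run in the foreground with all debug selectors to watch events flow.
filebeat -e -d "*" -c /etc/filebeat/filebeat.yml
```

Note that passing -c explicitly matters: judging by your startup log, Filebeat was run from /usr/share/filebeat/bin and resolved its config and data paths relative to that directory.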

Then we need to configure the output part of Filebeat:

#----------------------------- Logstash output --------------------------------
output.logstash:
  hosts: ["127.0.0.1:5044"]

Check: is the Logstash machine accessible from where Filebeat is running?
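One quick way to verify this, sketched in plain bash so no extra tools are needed. The host/port below come from the example output above — substitute your real Logstash address:

```shell
# Print "open" if a TCP connection to host:port succeeds within 3 seconds,
# "closed" otherwise. Uses bash's /dev/tcp pseudo-device.
check_port() {
  local host=$1 port=$2
  if timeout 3 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "open"
  else
    echo "closed"
  fi
}

check_port 127.0.0.1 5044
```

If this prints "closed", fix the network/firewall path (or start Logstash) before debugging Filebeat any further.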

Then we need to configure the Logstash side:

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}" 
  }
}
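Before restarting, it's worth asking Logstash itself to validate the pipeline. A sketch, assuming the pipeline files live under /etc/logstash/conf.d/ as in your path.config setting and Logstash is installed in the default package location:

```shell
# Check the pipeline config syntax and exit without starting Logstash.
/usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/
```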

Check: are events being indexed into Elasticsearch successfully?
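You can ask Elasticsearch directly which indices exist and how many documents they hold, using the host from the output section above:

```shell
# List filebeat-* indices with health, document counts, and sizes.
curl 'http://localhost:9200/_cat/indices/filebeat-*?v'
```

If a filebeat-6.0.0 index shows up with a growing docs.count, the pipeline works and any remaining problem is the index pattern in Kibana.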

This is how things should look.

Reference:
https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-log.html
https://www.elastic.co/guide/en/beats/filebeat/current/logstash-output.html

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.