Check that Docker Filebeat in one VM is sending to Docker Logstash in another VM

Like the title says, I have two VMs: one has Filebeat reading logs at a location, and the other has Logstash (and the rest of the Elasticsearch stack).

I've configured my Filebeat as follows:

```
filebeat.inputs:
  - type: log
    paths:
      - logpath/*.log

output.logstash:
  hosts: ["IP of other VM:5044"]

logging:
  level: error
  to_files: true
  files:
    path: /tmp
    name: fbeats.log
    keepfiles: 7
    rotateeverybytes: 10485760
```
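
Is there a quick way to confirm from inside the Filebeat container that this output can actually reach Logstash on the other VM? I was thinking of something like the following, where the container name `filebeat` is just a placeholder for whatever my container is actually called:

```
# Validate the Filebeat configuration syntax inside the container
docker exec filebeat filebeat test config

# Check that the Logstash output ("IP of other VM:5044") is reachable
docker exec filebeat filebeat test output
```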

and my Logstash is configured as:

```
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["IP OF current VM:9200"]
    index => "some index"
  }
}
```
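
On the Logstash/Elasticsearch VM, I assume I should also confirm that port 5044 is actually published by the Logstash container and listening on the host; the container name `logstash` below is again only a placeholder:

```
# Show which host ports the Logstash container publishes
docker port logstash

# Confirm something is listening on 5044 on the host
ss -tlnp | grep 5044
```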

However, when I fire up Kibana, it reports that Elasticsearch has no data. I believe this is either because Filebeat is not sending data to Logstash on the other VM, or because Logstash is not sending data to Elasticsearch (both Logstash and Elasticsearch are on the same VM)...
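
Is querying Elasticsearch directly the right way to tell whether any documents are arriving at all? Something along these lines (using the same host placeholder as in my Logstash output):

```
# List all indices and their document counts to see if anything is being indexed
curl "http://IP OF current VM:9200/_cat/indices?v"
```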

I added some logging to Filebeat, but I cannot find its logs, either on the host VM or within the Filebeat Docker container.

Any help or direction is greatly appreciated. Thanks!

Hi @jeffery and welcome :slight_smile:

It would help if you could edit your post and preformat the configuration as it appears in your files. You can do this by copying and pasting it and using the preformatted text button (</>) in the toolbar.

To know where to find the Filebeat logs, please give some details on how you are running it. If you are running it with systemd, logs will be available with `journalctl -u filebeat`; if you are running it with Docker, logs will be available through `docker logs`; in other cases, logs will be in the /var/log/filebeat directory.
In the Filebeat logs you will probably find the reason why Filebeat is not collecting any logs.
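
For example, depending on how you run it (the container name `filebeat` below is only a guess, use whatever name your container has):

```
# If Filebeat runs under systemd on the host
journalctl -u filebeat

# If Filebeat runs in a Docker container
docker logs filebeat

# Your config writes log files to /tmp inside the container, so they may also be there
docker exec filebeat ls -l /tmp/
```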
