We currently use GELF to ship logs from Docker containers to RabbitMQ via Logstash; a second Logstash instance then reads from RabbitMQ and writes to Elasticsearch.
I am worried that Logstash/GELF may drop messages if the connection to the RabbitMQ cluster is lost. Would it be better to configure Docker to write logs to files and then ship those files with Beats? Or are there other ways to guarantee message safety in the event of a sudden network failure?
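For reference, this is roughly what the file-based alternative would look like. The log-rotation sizes below are placeholder values, not recommendations. First, Docker's default `json-file` driver keeps logs on disk per container (set in `/etc/docker/daemon.json`):

```json
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "50m",
    "max-file": "5"
  }
}
```

Because the logs persist on the host, Filebeat can resume reading from its last recorded offset after a network outage, rather than losing in-flight messages the way an interrupted GELF/UDP stream can.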
Also, Beats doesn't support RabbitMQ as an output, so we would have to switch to Kafka or Redis. But if I'm already storing messages in files, couldn't I just send them directly to Elasticsearch with Beats?
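The direct-to-Elasticsearch variant I have in mind would be a minimal `filebeat.yml` along these lines (the host name is a placeholder for our cluster):

```yaml
filebeat.inputs:
  - type: container
    paths:
      - /var/lib/docker/containers/*/*.log

output.elasticsearch:
  hosts: ["http://elasticsearch.example.internal:9200"]
```

My understanding is that Filebeat tracks its read position in a registry file, so it retries and picks up where it left off if Elasticsearch is unreachable, which would remove the need for a broker in the middle. Is that reasoning sound, or is the broker still worth keeping for buffering under load?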