How to prevent losing logs

Hi folks,
Please share your experience and tips on how to protect against losing logs when Logstash or Elasticsearch goes down.

RabbitMQ? Redis?

I have one standalone ELK stack.

Thank you

I'd rather cluster it and apply an adequate cluster policy.
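For example, a minimal sketch of elasticsearch.yml for a three-node cluster, assuming Elasticsearch 7.x or later (node names and hostnames here are placeholders):

```yaml
# elasticsearch.yml on the first node; repeat on the others with their own node.name
cluster.name: logging-cluster
node.name: es-node-1
network.host: 0.0.0.0
# all three nodes list each other so they can form a single cluster
discovery.seed_hosts: ["es-node-1", "es-node-2", "es-node-3"]
cluster.initial_master_nodes: ["es-node-1", "es-node-2", "es-node-3"]
```

With at least one replica per index you can then restart or upgrade one node at a time without losing data.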

But then, what if RabbitMQ fails?

Logstash now has persistent queues, which would probably avoid some of these problems.
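A minimal sketch of enabling them in logstash.yml (the size and path are just examples):

```yaml
# logstash.yml: buffer events on disk between the input stage and the filter/output stages
queue.type: persisted
queue.max_bytes: 4gb                   # upper bound on disk used by the queue
path.queue: /var/lib/logstash/queue    # defaults to path.data/queue if not set
```

This protects against Logstash restarts and backpressure from a slow or briefly unavailable Elasticsearch, but not against losing the disk the queue lives on.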

What exactly is the problem you have, or think you might have?

The ELK stack needs to be updated more often than RabbitMQ :) I just want to protect against losing my logs while updating Elasticsearch/Logstash.

I see.

Kafka is also a good candidate IMHO. RabbitMQ and Redis are nice as well.
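As a rough sketch, you could put Kafka between a shipper Logstash and an indexer Logstash; the broker address, topic name, and group id below are just placeholders:

```
# shipper pipeline: write events to Kafka instead of straight to Elasticsearch
output {
  kafka {
    bootstrap_servers => "kafka1:9092"
    topic_id => "logs"
    codec => json
  }
}

# indexer pipeline: consume from Kafka and index into Elasticsearch
input {
  kafka {
    bootstrap_servers => "kafka1:9092"
    topics => ["logs"]
    group_id => "logstash-indexer"
    codec => json
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
}
```

While Elasticsearch or the indexer is down for an upgrade, events simply accumulate in the Kafka topic and are picked up again afterwards.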

Redis is shit. Don't recommend it (especially for newbies).
If Rabbit (or any other CaaS) fails, just restart it (either manually or by script) and you're back on track.

These forums are not the place for that sort of language, so please refrain from using it in future.

