I have a scenario where one Logstash instance is already running, and at some point the machine fails. I want processing to continue without waiting for that machine to be restarted: it should automatically fail over to a second Logstash instance that has been ingesting the same data in parallel all along.
Also, I want to use only Logstash, not Filebeat. I am not sure if load balancing is suitable here with just Logstash.
Is there any feature in logstash to do this?
Thanks in advance.
No, but if you are sending data to logstash using a network protocol (tcp, http) then you could use a load balancer to implement failover.
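For example, if both Logstash instances expose a TCP input, an HAProxy frontend could route traffic to one instance and fail over to the other when it goes down. This is only a sketch; the hostnames, ports, and backend names are placeholders:

```
frontend logstash_in
    bind *:5000
    mode tcp
    default_backend logstash_nodes

backend logstash_nodes
    mode tcp
    # active/backup failover: traffic goes to logstash-1
    # unless its health check fails, then logstash-2 takes over
    server ls1 logstash-1:5044 check
    server ls2 logstash-2:5044 check backup
```

Note this only helps for push-based inputs (tcp, http, beats); it does not apply to pull-based inputs like jdbc.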
Thanks for the quick reply.
We are using the JDBC input plugin to fetch data, scheduled to run every hour.
So my requirement is that data must keep flowing into Elasticsearch continuously and must not stop if my host machine fails. Basically, I want to create a failover architecture.
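For context, an hourly JDBC input looks roughly like this (connection string, credentials, driver path, and query are placeholders, not our actual setup):

```
input {
  jdbc {
    # placeholder connection details
    jdbc_connection_string => "jdbc:postgresql://db-host:5432/mydb"
    jdbc_user => "user"
    jdbc_driver_library => "/path/to/postgresql.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    # :sql_last_value tracks the last run so each poll only fetches new rows
    statement => "SELECT * FROM events WHERE updated_at > :sql_last_value"
    # cron syntax: run at the top of every hour
    schedule => "0 * * * *"
  }
}
```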
Also, I read about persistent queues. Can they be used here, or do they only resend data once the machine is restarted?
I do not think logstash supports any clustering for inputs, so this cannot be done.
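For reference, a persistent queue only buffers events on the local disk of the same host, so it protects against process restarts, not host failure. It is enabled in logstash.yml; the size and path values below are illustrative:

```
# logstash.yml
queue.type: persisted
queue.max_bytes: 1gb
# defaults to a directory under path.data if not set
path.queue: /var/lib/logstash/queue
```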
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.