I have a scenario where one Logstash instance is already running, and for some reason the machine fails. I want processing to keep going and not wait for the machine to be restarted. It must automatically start taking data from a second Logstash instance that has been ingesting data alongside the first one simultaneously all this time.
Also, I want to use only Logstash and not Filebeat. I am not sure if load balancing is suitable here for just Logstash.
Is there any feature in Logstash to do this?
Thanks for the quick reply.
We are using the JDBC input plugin for data fetching, and it is scheduled to run every hour.
My requirement is that data must flow continuously into Elasticsearch, and it should not stop if my host machine fails for some reason. Basically, I want to create a failover architecture.
Also, I read about persistent queues. Can they be used here, or do they only send data once the machine is restarted?
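For reference, here is a minimal sketch of the kind of pipeline we are running. The connection string, driver, query, and index name below are placeholders, not our real values:

```
input {
  jdbc {
    # Placeholder driver path and class — substitute your actual database driver
    jdbc_driver_library => "/path/to/driver.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://db-host:3306/mydb"  # placeholder
    jdbc_user => "user"
    jdbc_password => "password"
    # Cron-style schedule: run at the top of every hour, as described above
    schedule => "0 * * * *"
    statement => "SELECT * FROM my_table"  # placeholder query
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "my-index"  # placeholder index name
  }
}
```

As I understand it, the persistent queue itself would be enabled separately in `logstash.yml` (`queue.type: persisted`) rather than in this pipeline config, but I am not sure whether that helps with failover across machines.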