When sending logs from Kafka to ES, if ES is down and Logstash fails to send to ES, will it continue to consume messages from Kafka, or does it stop consuming because it fails to send?
Thanks
Logstash has internal queues. Once they are full, it will stop consuming.
I am sorry, I did not understand. I am using the Kafka input plugin to read from Kafka and sending to ES. When what is full will it stop consuming?
Thanks
When the internal queue in Logstash is full, it will stop reading from Kafka.
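For reference, a minimal sketch of the pipeline being discussed (broker address, topic name, group id, and ES host are placeholder assumptions, not taken from the thread). When the elasticsearch output cannot reach the cluster, it keeps retrying; that backpressure fills the internal queue, and the kafka input then stops polling the broker until events drain again:

```
# Hypothetical Kafka -> Elasticsearch pipeline; all connection details are placeholders.
input {
  kafka {
    bootstrap_servers => "localhost:9092"   # assumed broker address
    topics => ["app-logs"]                  # assumed topic name
    group_id => "logstash"                  # assumed consumer group
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]      # assumed ES endpoint
  }
}
```

Note that the default queue is a small in-memory buffer, so it fills quickly when ES is down. If you want Logstash to buffer more events to disk before backpressure reaches Kafka, you can enable the persistent queue in `logstash.yml` (e.g. `queue.type: persisted`), sized via `queue.max_bytes`.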
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.