Logstash: Output fails

I have two outputs, kafka and elasticsearch.

I want my setup to be fail-safe: if one of them stops, the other one can still index the data.

I tried stopping Elasticsearch while Kafka was still running, and I found that Logstash then stops writing to Kafka as well.
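
Here is roughly what my pipeline config looks like (the beats input, hosts, index, and topic name are just placeholders, not my exact setup):

```
input {
  beats {
    port => 5044
  }
}

output {
  # Both outputs sit in the same pipeline, so they share one event flow.
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
  kafka {
    bootstrap_servers => "localhost:9092"
    topic_id => "logs"
    codec => json
  }
}
```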

Is there any workaround for this?

TIA :smiley:

What's the desired behavior if e.g. ES stops accepting documents? Drop the events on the floor? Queue them somewhere?

If ES stops, Logstash should continue to index the data to Kafka.

And ES will lose the events that were processed while it was down?

Yes. Events will not get into ES.

That's unusual. I don't think the elasticsearch output plugin is capable of dropping events when ES is down, so it would take a more complicated setup, and I'm actually not sure how that would be done. It's a lot easier to construct something where the elasticsearch output picks up the events it has missed: just have two Kafka consumers in different Logstash pipelines. They'll consume from Kafka independently and won't affect each other. A rough sketch is below.
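
Something like this in pipelines.yml (pipeline IDs, paths, topic, and group ID are just examples); the main pipeline would write only to Kafka, and each downstream consumer runs in its own pipeline:

```
# pipelines.yml -- the ingest pipeline writes to Kafka; consumers run independently
- pipeline.id: ingest
  path.config: "/etc/logstash/conf.d/ingest.conf"
- pipeline.id: kafka-to-es
  path.config: "/etc/logstash/conf.d/kafka-to-es.conf"
```

The consumer pipeline that feeds ES would look something like this; any additional consumer gets its own pipeline and its own group_id so they don't affect each other:

```
# kafka-to-es.conf -- consumes from Kafka and indexes into ES.
# If ES goes down this pipeline simply lags on its consumer group and
# catches up from the committed offsets once ES is back; the ingest
# pipeline keeps writing to Kafka the whole time.
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["logs"]
    group_id => "logstash-es"
    codec => json
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
}
```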

I just want to test my environment. Since I have two outputs (ES and Kafka), I want to know what will happen if one of the two fails.

But you already know the answer to that question. One output with a problem will halt all other outputs in the same pipeline.
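
For completeness: another documented way to decouple outputs inside a single Logstash instance is the pipeline-to-pipeline "output isolator" pattern available in newer Logstash versions, where an upstream pipeline fans out to one downstream pipeline per output and each downstream pipeline has its own persistent queue. Rough sketch (pipeline IDs, input, hosts, and topic are just examples):

```
# pipelines.yml -- output isolator pattern: a stalled output only blocks
# its own downstream pipeline (until that pipeline's persistent queue fills).
- pipeline.id: intake
  config.string: |
    input { beats { port => 5044 } }
    output { pipeline { send_to => ["es-out", "kafka-out"] } }
- pipeline.id: es-out
  queue.type: persisted
  config.string: |
    input { pipeline { address => "es-out" } }
    output { elasticsearch { hosts => ["http://localhost:9200"] } }
- pipeline.id: kafka-out
  queue.type: persisted
  config.string: |
    input { pipeline { address => "kafka-out" } }
    output { kafka { bootstrap_servers => "localhost:9092" topic_id => "logs" codec => json } }
```

The caveat is that if ES stays down long enough for the es-out queue to fill (queue.max_bytes), backpressure reaches the intake pipeline again, so the queue has to be sized for the longest outage you want to ride out.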
