Elasticsearch is down. What happens to output data from Logstash?

Hi,

This has probably been asked before, but I would like to clarify it.

Elasticsearch went down suddenly on our central server, but Logstash was still running on a remote server parsing logs.
We started Elasticsearch again after a few hours.

Did we lose any data? If not, how does Logstash handle this?

Thanks.

LS will simply stop processing but keep running; the blocked output applies back-pressure through the pipeline, so the inputs stop reading as well.
Then when ES comes back up it will start sending again.

Does LS keep trying to flush the same output to Elasticsearch until it succeeds?

That's correct.
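
For illustration, here's a minimal pipeline showing where this happens. Treat it as a sketch: the file path, host, and index name are placeholders, and the retry behaviour described in the comments is the elasticsearch output plugin's default, not something you have to configure.

```
input {
  # Placeholder input: tail an application log file on the remote server
  file {
    path => "/var/log/app/app.log"
  }
}

output {
  # If ES is unreachable, this output keeps retrying the failed bulk
  # request. The blocked output applies back-pressure, so the file
  # input above stops reading until ES is back, which is why no
  # events are lost. Host and index name are placeholders.
  elasticsearch {
    hosts => ["http://central-server:9200"]
    index => "applogs-%{+YYYY.MM.dd}"
  }
}
```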

We verified that no data was lost. This is great!
Thanks @warkolm

Hi, we are facing a similar issue, but when ES is down we get a "Too many open files" exception and ultimately our application server crashes.

Is there any fix or configuration so that, if the ES output is not reachable, Logstash does not read any events until ES is back?

Thanks in advance @warkolm

Please start a new thread; this one is pretty old.

You need to check the number of open files allowed on your server.
You can get that with:
> ulimit -n
A common default is 1024. You need to raise it to a higher value, say 64k (65536).
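
For example (a sketch; exact steps vary by Linux distribution, and the `logstash` user below is an assumption about which account runs Logstash):

```
# Raise the limit for the current shell session only
ulimit -n 65536

# To make it persistent on most Linux systems, add entries to
# /etc/security/limits.conf for the account running Logstash
# (the "logstash" user here is an assumption):
#
#   logstash  soft  nofile  65536
#   logstash  hard  nofile  65536
```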

That should solve the "Too many open files" exception.