Is there a way to dump data to a file if Logstash fails to send it to Elasticsearch for a particular run?

I am using Logstash to move data from Postgres to Elasticsearch. The input is a JDBC input that fetches data from Postgres using a tracking_column; let's say for one run it picks up 1000 records from Postgres.
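
For reference, a minimal sketch of that pipeline (the connection string, table, and column names are just placeholders for my actual setup):

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "postgres"
    jdbc_password => "secret"
    jdbc_driver_class => "org.postgresql.Driver"
    # Fetch only rows newer than the last recorded tracking value
    statement => "SELECT * FROM my_table WHERE id > :sql_last_value ORDER BY id"
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    schedule => "*/5 * * * *"   # run every 5 minutes
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "my_index"
    document_id => "%{id}"
  }
}
```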

The output is Elasticsearch, and it indexes those 1000 records. If there is an error while sending the data to Elasticsearch, say the Elasticsearch node is unreachable for some reason, Logstash just moves on to the next run.

I want to know whether there is some way to rerun the same run if there is an error, or at least dump the data to a file so that I can load it manually afterwards.

You could use the dead letter queue for this.
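
It is enabled via logstash.yml. A minimal sketch (the path here is an example; by default the queue is written under path.data/dead_letter_queue):

```
# logstash.yml
dead_letter_queue.enable: true
# Optional: where the DLQ segments are written
path.dead_letter_queue: "/usr/share/logstash/data/dead_letter_queue"
# Optional: cap on the size of each pipeline's queue
dead_letter_queue.max_bytes: 1024mb
```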

This automatically writes events that were not successfully sent to ES to disk, and you can then use the dead_letter_queue input plugin to do something with these events (e.g. write them to a file or send them to someone via mail).
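
For example, a sketch of a consumer pipeline that reads the queue and dumps the failed events to a file for later replay (the path and pipeline_id here assume a default single-pipeline setup):

```
input {
  dead_letter_queue {
    # Must match path.dead_letter_queue from logstash.yml
    path => "/usr/share/logstash/data/dead_letter_queue"
    # The id of the pipeline whose failed events you want to read
    pipeline_id => "main"
    # Remember how far we have read so events are not reprocessed on restart
    commit_offsets => true
  }
}

output {
  # One JSON document per line, so the file can be replayed manually later
  file {
    path => "/tmp/logstash-failed-events.json"
    codec => json_lines
  }
}
```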


Thanks, this is what I was looking for.
