I am using Logstash 5.4.2 with the persistent queue enabled. My config file reads input through the JDBC input plugin, applies some transformations, and writes the output to MongoDB. But when I run it, Logstash inserts only a few records, say 6,000, whereas the expected output is 300,000 records, and then the main pipeline shuts down. When I inspect the page files in the data folder, they contain many more records. How can I flush that data to the output, either before or after the pipeline shutdown? My persistent queue settings are as follows.
Is there any way to flush all of the data from the persistent queue to the output during main pipeline shutdown, or is there a workaround for this issue?
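For reference, my pipeline config has roughly the shape below (the connection string, driver, SQL statement, and MongoDB names here are placeholders, not my real values):

```
input {
  jdbc {
    # Placeholder connection details -- real values redacted
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_user => "user"
    statement => "SELECT * FROM my_table"
  }
}

filter {
  # some transformations on the fetched rows
}

output {
  mongodb {
    uri => "mongodb://localhost:27017"
    database => "mydb"
    collection => "my_collection"
  }
}
```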
Thanks in Advance!