Logstash: flush the persistent queue to the output

Hi All,

I am using Logstash 5.4.2 with the persistent queue. I have a config file that reads input through JDBC, applies some transformations, and writes the output to MongoDB. But when I run it, Logstash inserts only a few records, say 6,000, where the actual output should be 300,000 records, and then the main pipeline shuts down. The page files in the data folder still contain far more records. How can I flush that data to the output before or after shutdown? My persistent queue settings are as follows, and a sketch of my pipeline config is below them.
pipeline.workers: 2
pipeline.output.workers: 1
pipeline.batch.size: 50
pipeline.batch.delay: 5
pipeline.unsafe_shutdown: false
config.test_and_exit: false
config.reload.automatic: false
queue.type: persisted
queue.page_capacity: 1gb
queue.max_events: 0
queue.max_bytes: 4gb
queue.checkpoint.acks: 1024
queue.checkpoint.writes: 1024
queue.checkpoint.interval: 1000
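
For context, my pipeline config is along these lines. This is only a sketch; the driver, connection string, query, and database/collection names below are placeholders, not my real values:

input {
  jdbc {
    jdbc_driver_library => "/path/to/jdbc-driver.jar"          # placeholder path
    jdbc_driver_class => "com.mysql.jdbc.Driver"               # placeholder driver class
    jdbc_connection_string => "jdbc:mysql://localhost:3306/sourcedb"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "SELECT * FROM source_table"                  # returns ~300,000 rows
  }
}
filter {
  # some transformations on the rows
}
output {
  mongodb {
    uri => "mongodb://localhost:27017"
    database => "targetdb"
    collection => "target_collection"
  }
}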

Is there any way to flush all data from the persistent queue to the output during main pipeline shutdown, or any workaround to handle this issue?

Thanks in advance!

and then the main pipeline shuts down

I'd expect Logstash to log additional details about why it shuts down.
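
If nothing useful shows up at the default log level, you can raise the verbosity when starting Logstash, for example (the config file name is a placeholder):

bin/logstash -f your-pipeline.conf --log.level debug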

Hi Magnusbaeck,

Logstash returns to the DOS prompt without having inserted all 300,000 records; it inserts only 6,000. How can I run it again with the same config file to insert the remaining 294,000 records?
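
For reference, I start Logstash like this each time, pointing at the same data folder that holds the page files (both paths are placeholders):

bin/logstash -f my-config.conf --path.data /path/to/logstash/data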

Hi Magnusbaeck,

It inserts only 6,000 records and then returns to the prompt (Logstash finishes execution). The remaining records are still in the page files in Logstash's data folder. How can I push the remaining records into MongoDB?
