Ensure delivery of events from SQL DB using Logstash

Hi,

I am currently using the JDBC input in my Logstash configuration. I would like to know the simplest setup or most common approach to ensure that the records extracted from the DB are delivered to Elasticsearch. I am leaning towards persistent queues, but if I understand correctly, persistent queues are only applicable to inputs that use a request-response protocol.
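For reference, my pipeline is roughly along these lines (driver, connection string, query, and index name are placeholders rather than my real values; the SQL Server driver is shown purely as an example):

```
input {
  jdbc {
    jdbc_driver_library => "/path/to/mssql-jdbc.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://dbhost:1433;databaseName=mydb"
    jdbc_user => "logstash"
    jdbc_password => "${JDBC_PASSWORD}"
    # Incrementally pull new rows based on a tracking column
    statement => "SELECT * FROM events WHERE id > :sql_last_value ORDER BY id"
    use_column_value => true
    tracking_column => "id"
    schedule => "*/5 * * * *"   # run every 5 minutes
  }
}

output {
  elasticsearch {
    hosts => ["http://eshost:9200"]
    index => "db-events"
  }
}
```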

Regards
Kenneth

If you enable the persistent queue you should be fine.
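For example, something along these lines in logstash.yml (the size is illustrative; by default the queue files live under path.data/queue):

```
# logstash.yml
queue.type: persisted     # default is "memory"
queue.max_bytes: 4gb      # illustrative cap on disk space used by the queue
```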

I am leaning towards persistent queues, but if I understand correctly, persistent queues are only applicable to inputs that use a request-response protocol.

What makes you think that?

@magnusbaeck thanks for the reply. I thought the JDBC input would not work with the persistent queue; maybe I was wrong.

However, I tried it just now to ingest over 2 million rows, and it seems not all rows were sent to Elasticsearch. Should Logstash process all events from the queue before shutting down?

Part of the point of the persistent queue is that Logstash won't have to wait for the queue to drain before shutting down since the queue will be available the next time it starts up.

So if I need Logstash to process all events in one go, I should set queue.drain: true, is that right?

Yes, that's how I interpret the documentation.
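That is, on top of the persistent queue setting, something like this (as I read the docs, drain makes Logstash keep processing until the queue is empty before it shuts down):

```
# logstash.yml
queue.type: persisted
queue.drain: true   # block shutdown until the persistent queue has been fully processed
```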

Thanks @magnusbaeck. It works well when ingesting around 300K to 400K rows, but with around 2 million records it's just slow, and when I looked at the monitoring stats for Logstash, the events received/sent counts just doubled up.
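(The received/sent counters I mean are the per-pipeline event counts, which can also be pulled from the local monitoring API on a default install, e.g.:)

```
curl -s 'http://localhost:9600/_node/stats/pipelines?pretty'
```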

Hmm. I'd look into the JVM heap situation.
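Concretely, check the heap settings in config/jvm.options and make sure the minimum and maximum match; something along these lines (4g is just an example, size it for your workload and available RAM):

```
# config/jvm.options
-Xms4g
-Xmx4g
```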
