JDBC Input and Scheduled Filter - events being missed

(Steve Earl) #1

Hi All,

Not sure there's much that can be done about this but I thought I'd ask the question.

I'm running the JDBC Input plugin to obtain new events from our database. The input is set to run every 2 minutes.
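For reference, a minimal sketch of that input (the connection details, driver, and SQL statement below are placeholders, not my real config):

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"  # placeholder
    jdbc_user => "logstash"                                            # placeholder
    jdbc_driver_class => "org.postgresql.Driver"                       # placeholder
    statement => "SELECT * FROM events WHERE created_at > :sql_last_value"
    # rufus-scheduler cron syntax: run every 2 minutes
    schedule => "*/2 * * * *"
  }
}
```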

I'm also using the logstash-filter-elapsed plugin to correlate 'START' and 'END' events from the JDBC data and work out the elapsed time between the two. I have its timeout value set to 60 seconds.
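Something along these lines, assuming each event carries a tag marking it as START or END and a correlation field (the tag and field names here are made up for illustration):

```
filter {
  elapsed {
    start_tag => "START"           # assumed tag on the start event
    end_tag => "END"               # assumed tag on the end event
    unique_id_field => "task_id"   # assumed field correlating the two events
    timeout => 60                  # seconds to wait for the matching END event
  }
}
```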

Unfortunately, if the JDBC schedule falls between the START event time and the END event time, the two events don't get correlated, as they are processed by different executions of the JDBC input plugin. For example:


13:59:58 START event created on DB
14:00:00 JDBC Input plugin executes (and processes START event)
14:00:05 END event created on DB
14:02:00 JDBC Input plugin executes (and processes END event)

From the timeline above, the END event is processed in a different 'run' of the JDBC-input so doesn't get associated with the START event.

Does anyone know whether setting the 'timeout' value on the elapsed filter to be greater than the polling interval of the JDBC input would help? I'm not entirely sure whether, when the JDBC input runs again, it starts a new 'instance' of the elapsed filter or reuses the same instance as before (in which case it may still be holding details of the START event).
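In other words, if the filter does keep its state between scheduled runs, bumping the timeout above the 2-minute polling interval would look something like this (tag and field names are illustrative, as before):

```
elapsed {
  start_tag => "START"
  end_tag => "END"
  unique_id_field => "task_id"   # assumed correlation field
  timeout => 180                 # longer than the 2-minute JDBC schedule
}
```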

All help appreciated.

(Guy Boertje) #2

As the elapsed filter holds state, I think you will need to have 1 worker to ensure that only 1 elapsed filter instance processes all events.
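For example, assuming Logstash 5.x or later (the setting name differs in older versions):

```
# In logstash.yml:
pipeline.workers: 1

# Or equivalently on the command line:
# bin/logstash -w 1 -f your_pipeline.conf
```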

(Steve Earl) #3


Thanks for the response. I think you're definitely right about only having one worker. Unfortunately, I think that works for records picked up in the current run, but not for START and END events that are processed in completely different scheduled runs of the JDBC input plugin.

I'll keep investigating...!


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.