Not sure there's much that can be done about this but I thought I'd ask the question.
I'm running the JDBC Input plugin to obtain new events from our database. The input is set to run every 2 minutes.
I'm also using the logstash-filter-elapsed plugin to correlate 'START' and 'END' events from the JDBC data and work out the elapsed time between the two. I have its timeout set to 60 seconds.
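For reference, here's a minimal sketch of the relevant parts of my pipeline (connection details, field names, and the SQL statement are placeholders, not my real config):

```
input {
  jdbc {
    # placeholder connection settings
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "logstash"
    jdbc_driver_library => "/path/to/driver.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * FROM events WHERE created_at > :sql_last_value"
    # run every 2 minutes
    schedule => "*/2 * * * *"
  }
}

filter {
  elapsed {
    start_tag => "START"
    end_tag => "END"
    # hypothetical field used to pair START/END events
    unique_id_field => "transaction_id"
    timeout => 60
  }
}
```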
Unfortunately, if the JDBC schedule fires between the START event time and the END event time, the two events don't get correlated, because they are processed by different executions of the JDBC input plugin. For example:
13:59:58 START event created on DB
14:00:00 JDBC Input plugin executes (and processes START event)
14:00:05 END event created on DB
14:02:00 JDBC Input plugin executes (and processes END event)
In the timeline above, the END event is processed in a different 'run' of the JDBC input, so it never gets associated with the START event.
Does anyone know whether setting the elapsed filter's 'timeout' to a value greater than the JDBC input's polling interval would help? I'm not entirely sure whether each run of the JDBC input starts a new 'instance' of the elapsed filter or reuses the same instance as before (in which case it may still be holding the details of the START event).
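In other words, I'm considering a change along these lines (assuming the elapsed filter keeps its START-event state in memory across JDBC input runs, which is the part I'm unsure about):

```
filter {
  elapsed {
    start_tag => "START"
    end_tag => "END"
    # hypothetical pairing field, as above
    unique_id_field => "transaction_id"
    # longer than the 120-second JDBC polling interval
    timeout => 180
  }
}
```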
All help appreciated.