I'm encountering a couple of issues with Logstash; it's probably me misunderstanding the documentation.
First, I cannot get Logstash to pick up where it left off when filling ES from an Oracle datastore. About 2 million records in, Logstash appears to stop, then restart (with debug logging enabled, it produces no error), and then runs the exact same query over again instead of using a recorded :sql_last_value. I have tried a myriad of configurations to track on a specific field, to use clean_run, and to use the other associated options (clean_run => false with an empty metadata file simply doesn't work, because it apparently tries to use "null" for :sql_last_value), and I'm out of ideas at this point. I can record an arbitrary timestamp in the last run metadata file, but instead of overwriting the value I manually place there, Logstash just reads from it repeatedly.
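For reference, here is roughly the shape of the jdbc input I've been trying. This is a minimal sketch, not my exact config: the connection string, credentials, paths, table name, and the `start_ts` column are placeholders, and the tracking options are the ones I've been toggling:

```conf
input {
  jdbc {
    # Connection details are placeholders
    jdbc_driver_library => "/path/to/ojdbc8.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//db-host:1521/SERVICE"
    jdbc_user => "user"
    jdbc_password => "password"

    # Query filtered on the tracked column so only new rows are pulled
    statement => "SELECT * FROM my_table WHERE start_ts > :sql_last_value ORDER BY start_ts"

    # Track a column instead of the query run time
    use_column_value => true
    tracking_column => "start_ts"
    tracking_column_type => "timestamp"

    # Persist the last seen value between runs
    record_last_run => true
    last_run_metadata_path => "/path/to/.logstash_jdbc_last_run"
    clean_run => false

    schedule => "* * * * *"
  }
}
```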
Here's a log snippet from Logstash starting up and, for whatever reason, starting at a zero date, which it shouldn't be, based on the configuration:
And here's a log snippet where Logstash automagically restarts with no human input and, of course, no logged reason why. And, of course, it executes the same query from the beginning of time instead of from the timestamp it should be recording or keeping track of somehow, according to the configuration and documentation:
Clearly I'm doing something completely wrong here, but it's not at all clear from the jdbc documentation page, and there are no provided examples for this scenario, so maybe I've added too many options, or not enough...
Second, my date filter appears to produce a _dateparsefailure with the current configuration. No debug information from the filter is ever generated, and I don't see in the documentation where I can put just the date filter into a debug or trace logging state - only Logstash as a whole (and SQL debug to see the query, I'm guessing - which is enabled). When I isolate the date match statement in its own configuration file, I can echo the timestamp into Logstash and it will match. I wrapped the date filter in a conditional to ensure that it can see the "start_ts" field, and it does indeed see it, but apparently the timestamp it receives is not what is being reported in the log (??) - the date filter doesn't match the timestamp, so I'm clearly not matching against the correct format. I would expect the date match to fail the same way when I isolate the date filter, but it doesn't - it takes whatever timestamp string I give it and applies it to the @timestamp field. No such luck when feeding from jdbc, though.
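For context, the isolated filter block looks roughly like this. It's a sketch: the field name `start_ts` matches my setup, but the match pattern shown is just one of several formats I've tried:

```conf
filter {
  # Only attempt the date match when the field actually exists
  if [start_ts] {
    date {
      # Pattern is an example; ISO8601 is a fallback I've also tried
      match => ["start_ts", "yyyy-MM-dd HH:mm:ss.SSS", "ISO8601"]
      target => "@timestamp"
      tag_on_failure => ["_dateparsefailure"]
    }
  }
}
```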
Here's a clip of console from the logs when I isolate the filter:
Here's the Logstash config:
Any help is appreciated here. I'm a bit burned out on this, as it really should be pretty straightforward, but has been anything but.