Hello.
It seems the sincedb update does not happen until tailing reaches the end of the file.
With this configuration:
input {
  file {
    path => "/home1/.logstash/json_input/deco.json.*"
    codec => "json"
    start_position => "beginning"
    add_field => ["[@metadata][input_id]", "deco_json"]
    sincedb_path => "/home1/.logstash/sincedb/deco_json.db2"
  }
}
Every day I move yesterday's log file into "/home1/.logstash/json_input/deco.json.YYYY.MM.DD".
Its size is approximately 1 GB, and it usually takes 20~30 minutes to process all lines of the file.
After yesterday's log file is moved into the Logstash input directory, it gets tailed fine and the output is sent to Elasticsearch.
The one difficulty I have with this setup:
If Logstash or Elasticsearch breaks while processing some line of the large file, Logstash has no sincedb position data: since it hasn't reached EOF yet, the sincedb position was never written.
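For context, once a file does reach EOF the sincedb file gets a line roughly like this (inode, device major number, device minor number, byte offset; the values here are made up for illustration):

270528 0 2306 1073741824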
Also, the sincedb_write_interval setting has no effect.
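For reference, this is roughly how I tried it inside the same file input (the 15-second value below is just an example, not my exact setting):

file {
  path => "/home1/.logstash/json_input/deco.json.*"
  codec => "json"
  start_position => "beginning"
  sincedb_path => "/home1/.logstash/sincedb/deco_json.db2"
  sincedb_write_interval => 15   # example interval in seconds; the position is still not written before EOF
}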
In that case I don't know from which line I should reprocess the log data that wasn't processed, so there is no way to fail over.
Any advice?
Is this expected behavior?
I am using Logstash 1.5.x.
Thanks!