Logstash sincedb file not getting updated in ELK 6.8.0

Hi,

we recently migrated to ELK 6.8.0, and while ingesting data via Logstash we are using
sincedb_path => "/export/vendor.sincedb"

in the file input plugin.

In the earlier ELK 5.5.x this file was updated regularly at a certain interval while the data ingestion process was running. Since we moved to the newer ELK 6.8.0, this is no longer working.

Our further processing depends on this file, and it is currently failing because of this.

Can you please suggest how Logstash can be made to update the sincedb file at a certain interval?

What does the file input configuration look like? If it isn't doing what you expect, you could enable '--log.level trace' on the command line and see what filewatch is doing.
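For example, a trace run can be started like this (the Logstash home and pipeline file name are placeholders for your installation):

    bin/logstash -f /path/to/pipeline.conf --log.level trace

The filewatch.* trace entries will then show file discovery, sincedb association, and read progress for each watched file.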

Here is my file input configuration -

input {
  file {
    path => "/uat/home/source/vendor/working/vendor.txt"
    type => "VENDOR_DATA"
    start_position => "beginning"
    sincedb_path => "/uat/home/sincedb/vendor_type.sincedb"
    sincedb_write_interval => 10
  }
}
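With sincedb_write_interval => 10 the plugin should flush state roughly every 10 seconds. For reference, a populated sincedb entry in logstash-input-file 4.x looks roughly like the line below (illustrative values only; to my understanding the fields are inode, major device number, minor device number, byte offset, last-activity timestamp, and path):

    1073743100 0 64782 719516801 1562155869.0 /uat/home/source/vendor/working/vendor.txt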

Please find the logs below -

[2019-07-03T14:19:14,814][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
/uat/home/installation/elk/logstash-6.8.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.10/lib/filewatch/sincedb_collection.rb:22:in `initialize'
/uat/home/installation/elk/logstash-6.8.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.10/lib/filewatch/observing_base.rb:62:in `build_watch_and_dependencies'
/uat/home/installation/elk/logstash-6.8.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.10/lib/filewatch/observing_base.rb:56:in `initialize'
[2019-07-03T14:20:40,303][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-07-03T14:31:09,712][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-07-03T14:31:09,720][TRACE][filewatch.sincedbcollection] open: reading from /uat/home/sincedb/vendor_type.sincedb
[2019-07-03T14:31:09,720][TRACE][filewatch.sincedbcollection] open: count of keys read: 0
[2019-07-03T14:31:09,724][TRACE][filewatch.discoverer ] discover_files {"count"=>1}
[2019-07-03T14:31:09,730][TRACE][filewatch.discoverer ] discover_files handling: {"new discovery"=>true, "watched_file details"=>"<FileWatch::WatchedFile: @filename='vendor.txt', @state='watched', @recent_states='[:watched]', @bytes_read='0', @bytes_unread='0', current_size='719516801', last_stat_size='719516801', file_open?='false', @initial=true, @sincedb_key='1073743100 0 64782'>"}
[2019-07-03T14:31:09,730][TRACE][filewatch.sincedbcollection] associate: finding {"inode"=>"1073743100", "path"=>"/uat/home/source/vendor/working/vendor.txt"}
[2019-07-03T14:31:09,731][TRACE][filewatch.sincedbcollection] associate: unmatched
[2019-07-03T14:31:09,738][TRACE][filewatch.tailmode.processor] Delayed Delete processing
[2019-07-03T14:31:09,738][TRACE][filewatch.tailmode.processor] Watched + Active restat processing
[2019-07-03T14:31:09,739][TRACE][filewatch.tailmode.processor] Rotation In Progress processing
[2019-07-03T14:31:09,739][TRACE][filewatch.tailmode.processor] Watched processing
[2019-07-03T14:31:09,739][TRACE][filewatch.tailmode.handlers.createinitial] handling: vendor.txt
[2019-07-03T14:31:09,740][TRACE][filewatch.tailmode.handlers.createinitial] opening vendor.txt
[2019-07-03T14:31:09,740][TRACE][filewatch.tailmode.handlers.createinitial] handle_specifically opened file handle: 239, path: vendor.txt
[2019-07-03T14:31:09,741][TRACE][filewatch.tailmode.handlers.createinitial] add_new_value_sincedb_collection {"position"=>0, "watched_file details"=>"<FileWatch::WatchedFile: @filename='vendor.txt', @state='active', @recent_states='[:watched, :watched]', @bytes_read='0', @bytes_unread='719516801', current_size='719516801', last_stat_size='719516801', file_open?='true', @initial=true, @sincedb_key='1073743100 0 64782'>"}
[2019-07-03T14:31:09,741][TRACE][filewatch.tailmode.processor] Active - file grew: vendor.txt: new size is 719516801, bytes read 0
[2019-07-03T14:31:09,742][TRACE][filewatch.tailmode.handlers.grow] handling: vendor.txt
[2019-07-03T14:31:09,742][TRACE][filewatch.tailmode.handlers.grow] reading... {"iterations"=>21957, "amount"=>32768, "filename"=>"vendor.txt"}
[2019-07-03T14:31:09,742][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk
[2019-07-03T14:31:09,863][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk

Currently I am quite stuck. Can you check the input configuration and the shared logs? Let me know if you need any further information.

That suggests that there are no line terminators in the file. It knows it needs to read 719 MB and is trying to do so; it is just not finding any lines in the file.
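If the file really did use a terminator other than "\n" (an assumption to rule out, e.g. bare "\r" endings), the file input's delimiter option could be set explicitly - a minimal sketch:

    file {
      path => "/uat/home/source/vendor/working/vendor.txt"
      delimiter => "\r"   # hypothetical; the default is "\n"
    }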

But the same file works perfectly with Logstash (ELK 5.5.x), and we get the expected output in the sincedb file. The line separator is also present in these files.

When I use the command below to check the number of lines in the file -

wc -l vendor.txt

I get the value "1834549". Since wc -l counts newline ("\n") characters, the file does contain line separators.
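As an additional byte-level check (a sketch assuming standard coreutils), the actual terminator bytes can be inspected directly:

    head -c 200 vendor.txt | od -c    # shows whether records end in \n or \r\n
    tail -c 1 vendor.txt | od -c      # a final \n means the last line is terminated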

Hi,

Can we get more pointers to drill down into this issue and find its root cause?

I am even able to ingest all records into Elasticsearch, and all records are OK.

The only problem is that the sincedb file is not being updated; its size even stays at 0 KB.
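One check that may narrow this down (a sketch; the pgrep pattern is illustrative): besides the periodic sincedb_write_interval flush, the file input is also expected to write the sincedb on a clean shutdown, so stopping Logstash and then inspecting the file separates "never written at all" from "not written on the periodic interval":

    kill -TERM $(pgrep -f logstash)             # request a clean shutdown
    cat /uat/home/sincedb/vendor_type.sincedb   # should hold an entry after exit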
