Logstash v6.5.1
I just noticed that the Logstash file input crashes and logs an error if the directory for a sincedb file specified in the file input section does not exist. It would be good if there was a switch to tell the file input to ignore such a case and carry on gracefully.
I have been writing a general input covering files that may or may not exist on a server, so that I can keep one generic Logstash shipper config for many server types. I assume I didn't notice the error before because I had logging turned off, but I haven't verified that against older versions of Logstash yet.
I guess I could just point to a single sincedb file in a generic directory to cover all files found on the system? Is there any potential for file contention if one sincedb is used to keep track of multiple file locations?
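Something like this is what I have in mind as a workaround (untested sketch; "D:/Elastic/sincedb-shared" is a made-up path pointing at a directory that does exist on every server, and I don't know whether sharing one sincedb between two file inputs is actually safe):

# Untested sketch: both inputs record their positions in one shared sincedb
# file that lives in a directory known to exist on every server.
# "D:/Elastic/sincedb-shared" is a made-up path, not anything official.
file {
  type => 'file1'
  path => "path1/file1"
  sincedb_path => "D:/Elastic/sincedb-shared"
}
file {
  type => 'file2'
  path => "path2/file2"
  sincedb_path => "D:/Elastic/sincedb-shared"
}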
Or how about an "if file exists" type condition?
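Purely to illustrate what I mean (this option does not exist today; the name is made up):

# Hypothetical: "ignore_missing_sincedb_path" is a made-up option and is not
# part of the file input today. The idea is that the input would log a warning
# and carry on instead of crashing when the sincedb directory is missing.
file {
  type => 'file1'
  path => "path1/file1"
  sincedb_path => "path1/sincedb"
  ignore_missing_sincedb_path => true
}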
The current config keeps a different sincedb for each file type in that file's directory, e.g.:
file {
  type => 'file1'
  path => "path1/file1"
  sincedb_path => "path1/sincedb"
}
file {
  type => 'file2'
  path => "path2/file2"
  sincedb_path => "path2/sincedb"
}
If path1 or path2 does not exist, the file input crashes:
[2018-12-04T15:04:30,495][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-12-04T15:04:30,500][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::File path=>["xxxxxxxxxxxxxxxxxxxxxx*.log"], codec=><LogStash::Codecs::Multiline pattern=>"^%{TIMESTAMP_ISO8601}", what=>"previous", id=>"527dbfb2-c171-4942-ad7a-302034d9777f", negate=>true, enable_metric=>true, charset=>"UTF-8", multiline_tag=>"multiline", max_lines=>500, max_bytes=>10485760>, add_field=>{"xxx"=>"xxxx", "ServiceName"=>"xxxxxxxxxx", "Environment"=>"xxxxxx"}, exclude=>["*Errors*"], id=>"d30b34b5e45cfda70dcd0acb34aaf3cec3853935ae078a1d314cb0e579b6baf7", type=>"xxxxxx", sincedb_path=>"xxxxxxxxxxxxx", enable_metric=>true, stat_interval=>1.0, discover_interval=>15, sincedb_write_interval=>15.0, start_position=>"end", delimiter=>"\n", close_older=>3600.0, mode=>"tail", file_completed_action=>"delete", sincedb_clean_after=>1209600.0, file_chunk_size=>32768, file_chunk_count=>140737488355327, file_sort_by=>"last_modified", file_sort_direction=>"asc">
Error: No such file or directory - xxxxxxxxxxxxxxxxxxxxxxx
Exception: Errno::ENOENT
Stack: org/jruby/RubyFile.java:366:in `initialize'
org/jruby/RubyIO.java:1154:in `open'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:1167:in `block in touch'
org/jruby/RubyArray.java:1734:in `each'
uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/fileutils.rb:1161:in `touch'
D:/Elastic/logstash-6.5.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-file-4.1.6/lib/filewatch/sincedb_collection.rb:22:in `initialize'
D:/Elastic/logstash-6.5.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-file-4.1.6/lib/filewatch/observing_base.rb:62:in `build_watch_and_dependencies'
D:/Elastic/logstash-6.5.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-file-4.1.6/lib/filewatch/observing_base.rb:56:in `initialize'
D:/Elastic/logstash-6.5.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-file-4.1.6/lib/logstash/inputs/file.rb:332:in `start_processing'
D:/Elastic/logstash-6.5.1/vendor/bundle/jruby/2.3.0/gems/logstash-input-file-4.1.6/lib/logstash/inputs/file.rb:337:in `run'
D:/Elastic/logstash-6.5.1/logstash-core/lib/logstash/pipeline.rb:409:in `inputworker'
D:/Elastic/logstash-6.5.1/logstash-core/lib/logstash/pipeline.rb:403:in `block in start_input'
[2018-12-04T15:04:31,518][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-12-04T15:04:31,549][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.