Here is what I would have expected. I configure a file input reading two files, then append "Foo" to one of them. Filtering the output with "egrep 'file|received' | grep -v configpathloader" gives
[...]
[2019-01-29T11:27:53,702][DEBUG][logstash.inputs.file ] config LogStash::Inputs::File/@file_sort_direction = "asc"
[2019-01-29T11:27:54,173][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/tmp/logstash-data/29396/plugins/inputs/file/.sincedb_1042830aa90348ed174dc186faffbb0a", :path=>["/tmp/foo/a", "/tmp/foo/b"]}
[2019-01-29T11:27:54,316][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-01-29T11:28:08,038][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk
[2019-01-29T11:28:08,060][DEBUG][logstash.inputs.file ] Received line {:path=>"/tmp/foo/a", :text=>"Foo"}
[2019-01-29T11:28:08,215][DEBUG][filewatch.sincedbcollection] writing sincedb (delta since last write = 1548779288)
[2019-01-29T11:28:08,308][DEBUG][logstash.pipeline ] filter received {"event"=>{"path"=>"/tmp/foo/a", "message"=>"Foo", "@timestamp"=>2019-01-29T16:28:08.180Z, [...]}}
[2019-01-29T11:28:08,309][DEBUG][logstash.pipeline ] output received {"event"=>{"path"=>"/tmp/foo/a", "message"=>"Foo", "@timestamp"=>2019-01-29T16:28:08.180Z, [...]}}
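As a quick sanity check of that filter, here it is run over two hypothetical sample lines (stand-ins for a real logstash-plain.log — only the logger names mimic the real output):

```shell
# Both lines match 'file', but the configpathloader line is dropped by grep -v
printf '%s\n' \
  '[DEBUG][logstash.inputs.file] Received line' \
  '[DEBUG][logstash.config.source.local.configpathloader] Reading config files' |
  egrep 'file|received' | grep -v configpathloader
```

Only the inputs.file line survives the pipeline.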
If you are not seeing that, then step the log level up from debug to trace. Here is an example of it reading a file from the beginning:
[2019-01-29T11:44:36,567][TRACE][filewatch.tailmode.handlers.createinitial] handling: a
[2019-01-29T11:44:36,568][TRACE][filewatch.tailmode.handlers.createinitial] opening a
[2019-01-29T11:44:36,569][TRACE][filewatch.tailmode.handlers.createinitial] handle_specifically opened file handle: 86, path: a
[2019-01-29T11:44:36,571][TRACE][filewatch.tailmode.handlers.createinitial] add_new_value_sincedb_collection {"position"=>0, "watched_file details"=>"<FileWatch::WatchedFile: @filename='a', @state='active', @recent_states='[:watched, :watched]', @bytes_read='0', @bytes_unread='8', current_size='8', last_stat_size='8', file_open?='true', @initial=true, @sincedb_key='13268895 0 51714'>"}
[2019-01-29T11:44:36,605][TRACE][filewatch.tailmode.processor] Active - no change {"watched_file"=>"<FileWatch::WatchedFile: @filename='b', @state='active', @recent_states='[:watched, :watched]', @bytes_read='0', @bytes_unread='0', current_size='0', last_stat_size='0', file_open?='true', @initial=false, @sincedb_key='13268896 0 51714'>"}
[2019-01-29T11:44:36,625][TRACE][filewatch.tailmode.processor] Active - file grew: a: new size is 8, bytes read 0
[2019-01-29T11:44:36,627][TRACE][filewatch.tailmode.handlers.grow] handling: a
[2019-01-29T11:44:36,655][TRACE][filewatch.tailmode.handlers.grow] reading... {"iterations"=>1, "amount"=>8, "filename"=>"a"}
[2019-01-29T11:44:36,657][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk
[2019-01-29T11:44:36,693][DEBUG][logstash.inputs.file ] Received line {:path=>"/tmp/foo/a", :text=>"Foo"}
[2019-01-29T11:44:36,847][DEBUG][logstash.inputs.file ] Received line {:path=>"/tmp/foo/a", :text=>"Foo"}
[2019-01-29T11:44:36,866][DEBUG][filewatch.sincedbcollection] writing sincedb (delta since last write = 1548780276)
[2019-01-29T11:44:36,870][TRACE][filewatch.sincedbcollection] sincedb_write: to: /tmp/logstash-data/29561/plugins/inputs/file/.sincedb_1042830aa90348ed174dc186faffbb0a
[2019-01-29T11:44:36,985][DEBUG][logstash.pipeline ] filter received {"event"=>{"path"=>"/tmp/foo/a", "@timestamp"=>2019-01-29T16:44:36.812Z, "message"=>"Foo", [...]}}
[2019-01-29T11:44:36,986][DEBUG][logstash.pipeline ] filter received {"event"=>{"path"=>"/tmp/foo/a", "@timestamp"=>2019-01-29T16:44:36.849Z, "message"=>"Foo", [...]}}
Then, about once a second, each file monitor reports that it is still tailing:
[2019-01-29T11:45:08,068][TRACE][filewatch.tailmode.processor] Active - no change {"watched_file"=>"<FileWatch::WatchedFile: @filename='b', @state='active', @recent_states='[:watched, :watched]', @bytes_read='0', @bytes_unread='0', current_size='0', last_stat_size='0', file_open?='true', @initial=false, @sincedb_key='13268896 0 51714'>"}
[2019-01-29T11:45:08,069][TRACE][filewatch.tailmode.processor] Active - no change {"watched_file"=>"<FileWatch::WatchedFile: @filename='a', @state='active', @recent_states='[:watched, :watched]', @bytes_read='8', @bytes_unread='0', current_size='8', last_stat_size='8', file_open?='true', @initial=false, @sincedb_key='13268895 0 51714'>"}
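That roughly once-a-second check is the file input's stat_interval, which defaults to 1 second and can be adjusted if the trace output is too chatty. A sketch, keeping the paths from the logs above:

```
input {
  file {
    path => ["/tmp/foo/a", "/tmp/foo/b"]
    stat_interval => 1    # seconds between stat checks of each watched file (the default)
  }
}
```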
And when I append a line ("Bar"), I get this:
[2019-01-29T11:45:19,087][TRACE][filewatch.tailmode.processor] Active - file grew: a: new size is 12, bytes read 8
[2019-01-29T11:45:19,088][TRACE][filewatch.tailmode.handlers.grow] handling: a
[2019-01-29T11:45:19,090][TRACE][filewatch.tailmode.handlers.grow] reading... {"iterations"=>1, "amount"=>4, "filename"=>"a"}
[2019-01-29T11:45:19,090][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk
[2019-01-29T11:45:19,093][DEBUG][logstash.inputs.file ] Received line {:path=>"/tmp/foo/a", :text=>"Bar"}
[2019-01-29T11:45:19,129][DEBUG][filewatch.sincedbcollection] writing sincedb (delta since last write = 43)
[2019-01-29T11:45:19,129][TRACE][filewatch.sincedbcollection] sincedb_write: to: /tmp/logstash-data/29561/plugins/inputs/file/.sincedb_1042830aa90348ed174dc186faffbb0a
[2019-01-29T11:45:19,241][DEBUG][logstash.pipeline ] filter received {"event"=>{"path"=>"/tmp/foo/a", "@timestamp"=>2019-01-29T16:45:19.125Z, "message"=>"Bar", [...]
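The byte accounting in those trace lines is just the newline-terminated text: two "Foo" lines are 8 bytes, and appending "Bar" adds 4 more (3 characters plus the newline), which is where "new size is 12, bytes read 8" and the read "amount"=>4 come from. A sketch:

```shell
# Reproduce the file sizes seen in the trace output above
f=$(mktemp)
printf 'Foo\nFoo\n' > "$f"    # current_size='8', bytes_unread='8'
printf 'Bar\n' >> "$f"        # file grew: 4 more bytes to read
wc -c < "$f"                  # byte count: 12
rm -f "$f"
```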