Hi guyboertje,
Sorry, please check the log below. I observed "[TRACE][filewatch.discoverer ] discover_files {"count"=>0}", but I have double-checked that the log file exists at that path and that it contains one event in JSON format. The relevant file input configuration is shown after the log.
[2019-01-19T08:22:00,158][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2019-01-19T08:22:00,898][TRACE][logstash.inputs.file ] Registering file input {:path=>["C:\elkstack\elasticsearch-6.5.1\logs\app.log"]}
[2019-01-19T08:22:01,023][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x1f520e7c run>"}
[2019-01-19T08:22:01,082][TRACE][logstash.agent ] Converge results {:success=>true, :failed_actions=>[], :successful_actions=>["id: main, action_type: LogStash::PipelineAction::Create"]}
[2019-01-19T08:22:01,164][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-19T08:22:01,202][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-01-19T08:22:01,229][DEBUG][logstash.agent ] Starting puma
[2019-01-19T08:22:01,256][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2019-01-19T08:22:01,295][TRACE][filewatch.sincedbcollection] open: reading from null
[2019-01-19T08:22:01,308][TRACE][filewatch.sincedbcollection] open: count of keys read: 0
[2019-01-19T08:22:01,367][DEBUG][logstash.api.service ] [api-service] start
[2019-01-19T08:22:01,390][TRACE][filewatch.discoverer ] discover_files {"count"=>0}
[2019-01-19T08:22:01,735][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-19T08:22:03,997][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-01-19T08:22:04,491][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-01-19T08:22:04,493][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-01-19T08:22:06,097][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x1f520e7c sleep>"}
[2019-01-19T08:22:09,002][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-01-19T08:22:09,514][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-01-19T08:22:09,516][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-01-19T08:22:11,107][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x1f520e7c sleep>"}
[2019-01-19T08:22:14,007][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-01-19T08:22:14,534][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-01-19T08:22:14,536][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-01-19T08:22:15,537][TRACE][filewatch.discoverer ] discover_files {"count"=>0}
[2019-01-19T08:22:16,110][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x1f520e7c sleep>"}
[2019-01-19T08:22:19,011][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-01-19T08:22:19,551][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-01-19T08:22:19,552][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-01-19T08:22:21,111][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x1f520e7c sleep>"}
[2019-01-19T08:22:24,018][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-01-19T08:22:24,572][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-01-19T08:22:24,575][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-01-19T08:22:26,112][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x1f520e7c sleep>"}
[2019-01-19T08:22:28,462][WARN ][logstash.runner ] SIGINT received. Shutting down.
[2019-01-19T08:22:28,495][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2019-01-19T08:22:28,520][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2019-01-19T08:22:28,524][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2019-01-19T08:22:28,526][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2019-01-19T08:22:28,609][DEBUG][logstash.agent ] Shutting down all pipelines {:pipelines_count=>1}
[2019-01-19T08:22:28,623][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>1}
[2019-01-19T08:22:28,632][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Stop/pipeline_id:main}
[2019-01-19T08:22:28,655][DEBUG][logstash.pipeline ] Stopping inputs {:pipeline_id=>"main", :thread=>"#<Thread:0x1f520e7c sleep>"}
[2019-01-19T08:22:28,662][DEBUG][logstash.inputs.file ] Stopping {:plugin=>"LogStash::Inputs::File"}
[2019-01-19T08:22:28,726][INFO ][filewatch.observingtail ] QUIT - closing all files and shutting down.
[2019-01-19T08:22:28,732][DEBUG][logstash.pipeline ] Stopped inputs {:pipeline_id=>"main", :thread=>"#<Thread:0x1f520e7c sleep>"}
[2019-01-19T08:22:29,588][TRACE][filewatch.sincedbcollection] caller requested sincedb write (tail mode subscribe complete - shutting down)
[2019-01-19T08:22:29,599][TRACE][filewatch.sincedbcollection] sincedb_write: to: null
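For reference, the file input section of my pipeline config is roughly as follows (the path matches the one registered in the log above; start_position, sincedb_path, and codec are shown as I believe I set them, so treat them as approximate):

```
input {
  file {
    # Path exactly as it appears in the "Registering file input" line above
    path => ["C:\elkstack\elasticsearch-6.5.1\logs\app.log"]
    # Read the file from the top
    start_position => "beginning"
    # Matches the "open: reading from null" / "sincedb_write: to: null" lines above
    sincedb_path => "null"
    # The file contains JSON events
    codec => "json"
  }
}
```

One thing I have not ruled out is whether the backslashes in path need to be forward slashes on Windows, since that could explain why the discoverer finds zero files.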