Input simple txt file

Hello,

Not sure what I'm missing:

I have set up a Docker ELK stack and now I'm trying to see whether I can ingest a local log file. I was having trouble with Logstash shutting down when I introduced a new input; see this thread:
https://discuss.elastic.co/t/pipeline-terminated-pipeline-id-monitoring-logstash-logstash-shutdown/202709

Now I just want to play with some files and test how they will look.

Here is my logstash.conf:

input {
  file {
    path => "/opt/docker-elk/input.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}



## Add your filters / logstash plugins configuration here

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "elastic"
    password => "changeme"
  }

  stdout {
    codec => rubydebug
  }
}

What am I missing? I don't see any activity logged when I add lines to input.txt.

Enable TRACE level logging and see what filewatch has to say.

How do I do that?

Add '--log.level trace' to the command line, set log.level in logstash.yml, or you may even be able to do it at runtime with

curl -XPUT 'localhost:9600/_node/logging?pretty' -H 'Content-Type: application/json' -d'
{
    "logger.filewatch.discoverer" : "TRACE",
    "logger.filewatch.observingtail" : "TRACE",
    "logger.filewatch.sincedbcollection" : "TRACE",
    "logger.filewatch.tailmode.handlers.createinitial" : "TRACE",
    "logger.filewatch.tailmode.handlers.grow" : "TRACE",
    "logger.filewatch.tailmode.processor" : "TRACE"
}
'
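If you go the logstash.yml route instead, the static equivalent is the setting below. Note this raises logging to TRACE for everything, not just the filewatch loggers, so the curl call above is the more targeted option:

```yaml
# logstash.yml — applies on restart, to all loggers
log.level: trace
```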

If you are going to use that approach, I would start Logstash with a file input that reads a non-existent file, then enable tracing, then add a line to the file.
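That sequence could look something like this (the /tmp path is just an example for the walk-through, not from your config):

```shell
# Start from a path that does not exist yet
LOGFILE=/tmp/filewatch-demo.txt
rm -f "$LOGFILE"

# 1. Start Logstash with a file input whose path => "/tmp/filewatch-demo.txt"
# 2. Enable TRACE for the filewatch loggers using the curl call above
# 3. Only now create the file by appending its first line; the trace output
#    should show the discoverer finding the new path and the tail handlers
#    (createinitial, grow) reading it
echo "first line" >> "$LOGFILE"
tail -n 1 "$LOGFILE"
```

If nothing shows up in the trace output at step 3, the file input is most likely not watching the path you think it is.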