Logstash not executing after pipeline main started

I am stuck in a situation where Logstash does not execute anything after the main pipeline has started:

Settings: Default pipeline workers: 4
Pipeline main started

I have checked the following:

  1. I have "touch"ed the .csv so the file's last modified date is less than one day old.
  2. I have added sincedb_path => "/dev/null" in the input { file { ... } } section (see the sketch after this list).
  3. I have deleted the .sincedb_.. file in the Logstash directory.
  4. I found no extra lines in my .csv file.
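
For reference, here is roughly where that setting sits in my configuration (the path below is a placeholder, not my real pattern):

    input {
      file {
        # placeholder path, not my real pattern
        path => "/path/to/my.csv"
        # disable position tracking so the file is re-read on every run
        sincedb_path => "/dev/null"
      }
    }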

When I try to run the configuration with --debug it keeps printing "Flushing buffer at interval", and I fail to get the index in Kibana.

Thank you so much.

Ignore the "Flushing buffer at interval" noise and look for anything related to the input file. If you post the debug log somewhere (e.g. in a gist) we can help.

New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["localhost:9200"], :level=>:info, :file=>"logstash/outputs/elasticsearch/common.rb", :line=>"19", :method=>"register"}
Pipeline started {:level=>:info, :file=>"logstash/pipeline.rb", :line=>"109", :method=>"run"}

Logstash startup completed

Flushing buffer at interval {:instance=>"#<LogStash::Outputs::Elasticsearch::Buffer:0x5d43ff72 @operations_mutex=#<Mutex:0x53840aaa>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0x57bb9402>, @submit_proc=#<Proc:0x5043c5db@/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:55>, @logger=#<Cabin::Channel:0x23615423 @metrics=#<Cabin::Metrics:0x2871cfc9 @metrics_lock=#<Mutex:0x7296ffde>, @metrics={}, @channel=#<Cabin::Channel:0x23615423 ...>>, @subscriber_lock=#<Mutex:0x3c35814a>, @level=:debug, @subscribers={12590=>#<Cabin::Outputs::IO:0x33758056 @io=#<IO:fd 1>, @lock=#<Mutex:0x76d8e1e9>>}, @data={}>, @last_flush=2016-11-01 14:58:40 +0800, @flush_interval=1, @stopping=#<Concurrent::AtomicBoolean:0x66f711a9>, @buffer=, @flush_thread=#<Thread:0x138b06d6 run>>", :interval=>1, :level=>:info, :file=>"logstash/outputs/elasticsearch/buffer.rb", :line=>"90", :method=>"interval_flush"}

Then it keeps looping on "Flushing buffer at interval".

That's not the whole debug log.

I strongly recommend that you don't enable any elasticsearch outputs until you've confirmed that data arrives to a stdout { codec => rubydebug } output.
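
For example, a minimal test configuration could look like this (the path is a placeholder for whatever you are reading):

    input {
      file {
        path => "/path/to/your.csv"
        # re-read the file from the top on every run while debugging
        start_position => "beginning"
        sincedb_path => "/dev/null"
      }
    }
    output {
      # print every event to the console instead of shipping it to Elasticsearch
      stdout { codec => rubydebug }
    }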

_globbed_files: /my/directory/\w+.csv: glob is: {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"346", :method=>"_globbed_files"}
Pushing flush onto pipeline {:level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
Pushing flush onto pipeline {:level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
Pushing flush onto pipeline {:level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}

Is that the problem? I found that it keeps looping among these lines.

Additionally, if I comment out sincedb_path => "/dev/null", I find that a .sincedb file is created.
However, nothing is printed after "Settings: Default pipeline workers: 4" and "Pipeline main started".

I have tried stdout { codec => rubydebug } but still couldn't figure out the problem. :frowning:

So you have "/my/directory/\w+.csv" in your file input's path setting? Regexps aren't supported there, only glob patterns.
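
For example, to pick up every .csv file in that directory you would use a wildcard instead:

    input {
      file {
        # glob wildcard, not a regexp like \w+
        path => "/my/directory/*.csv"
        sincedb_path => "/dev/null"
      }
    }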

Thank you so much!