Logstash is not processing files from the Azure Storage blob input plugin, even though new files are available to process

I have been using the Azure Storage input plugin for JSON files. After an improper shutdown of the pipeline, the restarted Logstash process is not picking up files from the Azure Storage blob container, even though there are many pending files to process.

The Logstash config is as follows:

input {
  azureblob {
    storage_account_name => "aggregatesdata"
    storage_access_key => "key"
    container => "custommetric"
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["10.158.36.199"]
    codec => json
    index => "it_custommetric"
    document_type => "dailyaggregate"
    document_id => "%{dataid}"
  }
}
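
To separate input problems from output problems, one quick test is to temporarily swap the elasticsearch output for stdout and watch whether the azureblob input emits any events at all. A minimal sketch using the same input settings as above (stdout with the rubydebug codec is a stock Logstash output/codec):

input {
  azureblob {
    storage_account_name => "aggregatesdata"
    storage_access_key => "key"
    container => "custommetric"
  }
}
output {
  # print each event to the console so we can see whether the input produces anything
  stdout { codec => rubydebug }
}

If nothing prints, the hang is on the input side (listing or reading blobs); if events appear, the problem is between Logstash and Elasticsearch.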

That doesn't look like a standard input plugin; where did you find it?

I got it from the link below.

Actually, the problem is not with the plugin. Logstash keeps printing the following message continuously. Can you please help resolve this issue? It seems to be related to the improper shutdown of the earlier process.

If so, how can I resolve it? These messages keep showing up even after a system restart.

{:timestamp=>"2016-11-16T11:30:48.160000+0000", :message=>#<LogStash::PipelineReporter::Snapshot:0x4e18b8e6 @data={:events_filtered=>0, :events_consumed=>0, :worker_count=>2, :inflight_count=>0, :worker_states=>[{:status=>"sleep", :alive=>true, :index=>0, :inflight_count=>0}, {:status=>"sleep", :alive=>true, :index=>1, :inflight_count=>0}], :output_info=>[{:type=>"elasticsearch", :config=>{"action"=>"index", "hosts"=>["10.158.36.220"], "codec"=>"json", "index"=>"it_customevent", "document_type"=>"dailyaggregate", "document_id"=>"%{dataid}", "ALLOW_ENV"=>false}, :is_multi_worker=>false, :events_received=>0, :workers=><Java::JavaUtilConcurrent::CopyOnWriteArrayList:2082101559 [<LogStash::Outputs::ElasticSearch action=>"index", hosts=>["10.158.36.220"], codec=><LogStash::Codecs::JSON charset=>"UTF-8">, index=>"it_customevent", document_type=>"dailyaggregate", document_id=>"%{dataid}", workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, flush_size=>500, idle_flush_time=>1, doc_as_upsert=>false, max_retries=>3, script_type=>"inline", script_var_name=>"event", scripted_upsert=>false, retry_max_interval=>2, retry_max_items=>500, retry_on_conflict=>1, ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5>]>, :busy_workers=>0}],
:thread_info=>[
{"thread_id"=>19, "name"=>"[main]<azureblob", "plugin"=>nil, "backtrace"=>............ms/logstash-input-azureblob-0.9.5/lib/logstash/inputs/azureblob.rb:39:in `list_blob_names'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-input-azureblob-0.9.5/lib/logstash/inputs/azureblob.rb:37:in `loop'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-input-azureblob-0.9.5/lib/logstash/inputs/azureblob.rb:37:in `list_blob_names'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-input-azureblob-0.9.5/lib/logstash/inputs/azureblob.rb:73:in `process'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-input-azureblob-0.9.5/lib/logstash/inputs/azureblob.rb:89:in `run'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:342:in `inputworker'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:336:in `start_input'"], "blocked_on"=>nil, "status"=>"run", "current_call"=>"[...]/vendor/bundle/jruby/1.9/gems/nokogiri-1.6.8-java/lib/nokogiri/xml/searchable.rb:165:in `evaluate'"},
{"thread_id"=>23, "name"=>"[main]>worker0", "plugin"=>nil, "backtrace"=>["[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:309:in `synchronize'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:309:in `inflight_batches_synchronize'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:234:in `worker_loop'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:201:in `start_workers'"], "blocked_on"=>nil, "status"=>"sleep", "current_call"=>"[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:309:in `synchronize'"},
{"thread_id"=>24, "name"=>"[main]>worker1", "plugin"=>nil, "backtrace"=>["[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:309:in `synchronize'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:309:in `inflight_batches_synchronize'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:234:in `worker_loop'", "[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:201:in `start_workers'"], "blocked_on"=>nil, "status"=>"sleep", "current_call"=>"[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:309:in `synchronize'"}],
:stalling_threads_info=>[{"thread_id"=>19, "name"=>"[main]<azureblob", "plugin"=>nil, "current_call"=>"[...]/vendor/bundle/jruby/1.9/gems/nokogiri-1.6.8-java/lib/nokogiri/xml/searchable.rb:165:in `evaluate'"}, {"thread_id"=>23, "name"=>"[main]>worker0", "plugin"=>nil, "current_call"=>"[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:309:in `synchronize'"}, {"thread_id"=>24, "name"=>"[main]>worker1", "plugin"=>nil, "current_call"=>"[...]/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:309:in `synchronize'"}]}>, :level=>:warn}
{:timestamp=>"2016-11-16T11:30:48.160000+0000", :message=>"Forcefully quitting logstash..", :level=>:fatal}
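
For what it's worth, the stalling_threads_info section above says the two pipeline workers are idle (sleeping in synchronize with zero events in flight) while the azureblob input thread (thread_id 19) is busy inside nokogiri's searchable.rb, i.e. still parsing the XML blob listing returned by list_blob_names. A few generic checks can help confirm this from outside Logstash's own reporter. A sketch assuming the Logstash 2.4 CLI; the config path and the pgrep pattern are placeholders:

# validate the pipeline config for syntax errors first
bin/logstash --configtest -f /etc/logstash/conf.d/azureblob.conf

# re-run in the foreground with debug logging to watch what the input is doing
bin/logstash -f /etc/logstash/conf.d/azureblob.conf --debug

# while it appears hung, take a JVM thread dump (jstack ships with the JDK);
# it should show the same input thread busy in XML parsing if the blob
# listing is the bottleneck
jstack $(pgrep -f logstash) > /tmp/logstash-threads.txt

If the input thread really is stuck walking a very large container listing, moving already-processed blobs out of the container may shrink the listing the plugin has to traverse on every poll.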
