Input logs from Azure Blob using Logstash and output to Elasticsearch

Hi All,

I have been trying to pull the logs of my web app, which are stored in my Azure storage account (Blob).

I am using the logstash-input-azureblob plugin, which I have already installed:

[root@AZEUSELKVM01 ~]# /usr/share/logstash/bin/logstash-plugin list | grep blob
logstash-input-azureblob
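
For reference, the plugin was installed with the standard plugin manager (the path is from this install and may differ on other setups):

/usr/share/logstash/bin/logstash-plugin install logstash-input-azureblob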

I am using the configuration below in my logstash.conf:

input {
  azureblob {
    storage_account_name => "testblob"
    storage_access_key => "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    container => "test"
    codec => "json"
    file_head_bytes => 12
    file_tail_bytes => 2
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "blob-%{+YYYY.MM.dd}"
  }
}

When I start Logstash, it shows a plugin error.

Could you please help with how to pull the logs that are stored in Azure Blob Storage using Logstash?

Please find below the error for your reference:

[2019-03-08T08:58:28,535][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::PluginLoadingError", :message=>"Couldn't find any input plugin named 'azureblob'. Are you sure this is correct? Trying to load the azureblob input plugin resulted in this error: Problems loading the requested plugin named azureblob of type input. Error: TypeError no implicit conversion of nil into String", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:211:in `lookup_pipeline_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/plugin.rb:137:in `lookup'", "org/logstash/plugins/PluginFactoryExt.java:222:in `plugin'", "org/logstash/plugins/PluginFactoryExt.java:181:in `plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:71:in `plugin'", "(eval):8:in '", "org/jruby/RubyKernel.java:994:in `eval'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:43:in `block in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94:in `block in exclusive'", "org/jruby/ext/thread/Mutex.java:148:in `synchronize'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94:in `exclusive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:39:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:327:in `block in converge_state'"]}
[2019-03-08T08:58:29,089][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
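
For anyone hitting the same error, the usual first checks are to validate the pipeline file and re-install the plugin (standard Logstash CLI commands; adjust the config path to your setup):

# Validate the pipeline file without starting Logstash
/usr/share/logstash/bin/logstash --config.test_and_exit -f /path/to/logstash.conf

# Reinstall the plugin in case the gem ended up in a bad state
/usr/share/logstash/bin/logstash-plugin remove logstash-input-azureblob
/usr/share/logstash/bin/logstash-plugin install logstash-input-azureblob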

Can someone update? I'm still facing this issue.

Why not use the azure_event_hubs plugin?

I don't believe the azure_event_hubs plugin is capable of processing data from an Azure Blob Storage account. If it can, I'd sure like to know how!

I haven't tried it, but the documentation says "Azure Blob Storage account is an essential part of Azure-to-Logstash configuration" and "A Blob Storage account is highly recommended for use with this plugin, and is likely required for production servers". That suggests to me that it can process data from an Azure Blob Storage account. YMMV.

I've used the Event Hubs plugin quite a bit and am pretty familiar with it, but thought maybe I'd missed something.

The plugin uses an Azure Blob Storage account only for tracking the progress of processed events from the Event Hub, by creating a small checkpoint file in the blob storage account. This is to avoid duplicate events. Unfortunately, it doesn't work as an input.
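
To illustrate, this is roughly what the azure_event_hubs input looks like when the storage account is wired in for checkpointing only (connection strings are placeholders; option names are from the plugin's documentation):

input {
  azure_event_hubs {
    # The Event Hub is the actual data source
    event_hub_connections => ["Endpoint=sb://example-ns.servicebus.windows.net/;SharedAccessKeyName=logstash;SharedAccessKey=REDACTED;EntityPath=insights-logs-example"]
    # The Blob Storage account only stores checkpoints/offsets, so a restart
    # does not replay events that were already processed
    storage_connection => "DefaultEndpointsProtocol=https;AccountName=example;AccountKey=REDACTED;EndpointSuffix=core.windows.net"
    storage_container => "logstash-offsets"
    consumer_group => "logstash"
    threads => 8
  }
}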

I'm looking for another way to pull logs from an Azure Blob Storage account, as I've also tried the azureblob input plugin, but it doesn't seem to be very well supported and has problems with high resource usage.

Thanks anyway!
