Pipeline aborted due to error - Invalid argument

I'm quite new to the ELK stack and am helping take over a config someone else built.
One of my azureblob pipelines has recently stopped working.
I've spent a few hours digging around on Google, but haven't found anything that relates to this specific error. My other azureblob pipeline runs fine, so I'm guessing the problem is in my config, but I can't see what it is.
The installation is Logstash 6.6.0 on Windows, with the azureblob plugin installed.

Here is the error from the logstash-plain.log:

[2019-08-19T23:49:27,856][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"NSG", :plugin=>"<LogStash::Inputs::LogstashInputAzureblob container=>\"insights-logs-networksecuritygroupflowevent\", codec=><LogStash::Codecs::JSON id=>\"json_5be89481-3939-4ff2-9fa1-ff7c3032033b\", enable_metric=>true, charset=>\"UTF-8\">, storage_account_name=>\"azhkstorage\", file_tail_bytes=>2, storage_access_key=>\"*******REDACTED********", id=>\"*******REDACTED*******", file_head_bytes=>12, enable_metric=>true, endpoint=>\"core.windows.net\", registry_path=>\"data/registry\", registry_lease_duration=>15, interval=>30, registry_create_policy=>\"resume\", blob_list_page_size=>100, file_chunk_size_bytes=>4194304>", :error=>"Invalid argument - C:/Windows/system32/config/systemprofile/.gem/specs/api.rubygems.org%443/latest_specs.4.8", :thread=>"#<Thread:0x17e42f41 run>"}
[2019-08-19T23:49:28,764][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"NSG", :exception=>#<Errno::EINVAL: Invalid argument - C:/Windows/system32/config/systemprofile/.gem/specs/api.rubygems.org%443/latest_specs.4.8>, :backtrace=>["org/jruby/RubyFile.java:341:in `flock'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rubygems.rb:898:in `block in write_binary'", "org/jruby/RubyIO.java:1156:in `open'", "org/jruby/RubyKernel.java:306:in `open'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rubygems.rb:896:in `write_binary'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rubygems/remote_fetcher.rb:334:in `cache_update_path'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rubygems/source.rb:194:in `load_specs'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rubygems/spec_fetcher.rb:267:in `tuples_for'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rubygems/spec_fetcher.rb:231:in `block in available_specs'", "org/jruby/RubyArray.java:1734:in `each'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rubygems/source_list.rb:98:in `each_source'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rubygems/spec_fetcher.rb:227:in `available_specs'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rubygems/spec_fetcher.rb:103:in `search_for_dependency'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rubygems/spec_fetcher.rb:167:in `spec_for_dependency'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/rubygems.rb:949:in `latest_spec_for'", "D:/ELK/Logstash/logstash-6.6.0/vendor/bundle/jruby/2.3.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:135:in `register'", "D:/ELK/Logstash/logstash-6.6.0/logstash-core/lib/logstash/pipeline.rb:242:in `register_plugin'", "D:/ELK/Logstash/logstash-6.6.0/logstash-core/lib/logstash/pipeline.rb:253:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", 
"D:/ELK/Logstash/logstash-6.6.0/logstash-core/lib/logstash/pipeline.rb:253:in `register_plugins'", "D:/ELK/Logstash/logstash-6.6.0/logstash-core/lib/logstash/pipeline.rb:396:in `start_inputs'", "D:/ELK/Logstash/logstash-6.6.0/logstash-core/lib/logstash/pipeline.rb:294:in `start_workers'", "D:/ELK/Logstash/logstash-6.6.0/logstash-core/lib/logstash/pipeline.rb:200:in `run'", "D:/ELK/Logstash/logstash-6.6.0/logstash-core/lib/logstash/pipeline.rb:160:in `block in start'"], :thread=>"#<Thread:0x17e42f41 run>"}
[2019-08-19T23:49:28,779][ERROR][logstash.agent           ] Failed to execute action {:id=>:NSG, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<NSG>, action_result: false", :backtrace=>nil}

Here is the relevant pipeline config:

input {
	azureblob {
		storage_account_name => "azhkstorage"
		storage_access_key => "******REDACTED*******"
		container => "insights-logs-networksecuritygroupflowevent"
		codec => "json"
		# Refer to https://docs.microsoft.com/azure/network-watcher/network-watcher-read-nsg-flow-logs
		# Typical numbers are 21/9 or 12/2, depending on the NSG log file type
		file_head_bytes => 12
		file_tail_bytes => 2
		# Enable / tweak these settings when an event is too big for the codec to handle.
		# break_json_down_policy => "with_head_tail"
		# break_json_batch_count => 2
	}
}

filter {
	split { field => "[records]" }
	split { field => "[records][properties][flows]" }
	split { field => "[records][properties][flows][flows]" }
	split { field => "[records][properties][flows][flows][flowTuples]" }

	mutate {
		split => { "[records][resourceId]" => "/" }
		add_field => {
			"Subscription" => "%{[records][resourceId][2]}"
			"ResourceGroup" => "%{[records][resourceId][4]}"
			"NetworkSecurityGroup" => "%{[records][resourceId][8]}"
		}
		convert => { "Subscription" => "string" }
		convert => { "ResourceGroup" => "string" }
		convert => { "NetworkSecurityGroup" => "string" }
		split => { "[records][properties][flows][flows][flowTuples]" => "," }
		add_field => {
			"unixtimestamp" => "%{[records][properties][flows][flows][flowTuples][0]}"
			"srcIp" => "%{[records][properties][flows][flows][flowTuples][1]}"
			"destIp" => "%{[records][properties][flows][flows][flowTuples][2]}"
			"srcPort" => "%{[records][properties][flows][flows][flowTuples][3]}"
			"destPort" => "%{[records][properties][flows][flows][flowTuples][4]}"
			"protocol" => "%{[records][properties][flows][flows][flowTuples][5]}"
			"trafficflow" => "%{[records][properties][flows][flows][flowTuples][6]}"
			"trafficdecision" => "%{[records][properties][flows][flows][flowTuples][7]}"
		}
		convert => { "unixtimestamp" => "integer" }
		convert => { "srcPort" => "integer" }
		convert => { "destPort" => "integer" }
	}

	date {
		match => [ "unixtimestamp", "UNIX" ]
	}
}
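For context on what that filter is doing: each entry in [flowTuples] is a comma-separated string in the NSG flow log format (per the Microsoft doc linked in the config comments), and the mutate block splits it and maps positions 0-7 onto named fields. A minimal Ruby sketch of that mapping, using a made-up sample tuple (all values below are hypothetical):

```ruby
# Hypothetical flow tuple in the order the config's add_field indices assume:
# unixtimestamp,srcIp,destIp,srcPort,destPort,protocol,trafficflow,trafficdecision
tuple = "1566259200,10.0.0.4,52.167.109.72,44931,443,T,O,A"

FIELDS = %w[unixtimestamp srcIp destIp srcPort destPort protocol trafficflow trafficdecision]

# Equivalent of `split => { "...flowTuples" => "," }` plus the add_field
# references [0]..[7]: split on commas and zip positions onto field names.
event = FIELDS.zip(tuple.split(",")).to_h

# Equivalent of the convert => "integer" steps.
%w[unixtimestamp srcPort destPort].each { |f| event[f] = Integer(event[f]) }

event
```

So after the filter runs, each event should carry one flow tuple flattened into those named fields, with the timestamp and ports as integers.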


output {
	elasticsearch {
		hosts => "AZ-HK-ELK01"
		index => "nsg-flow-logs-%{+YYYY.MM.dd}"
	}
}
