Failing with multi-file config and auto-reload flag

Hi,

I have an issue running a multi-file configuration (a folder of *.conf files) together with the auto-reload flag (--auto-reload).
This causes Logstash to crash and shut down as soon as I send a syslog event for parsing.
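
For reference, I launch Logstash roughly like this (the config directory path is just an example, the real one differs):

bin/logstash -f /etc/logstash/conf.d/ --auto-reload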

I am seeing this error message:

"NoMethodError: undefined method `multi_filter' for nil:NilClass"

Please advise if anyone else is having this issue.

Providing more details would be helpful!

What version are you on? What about Java? What OS? Can you provide the full error?
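
If it helps, these commands will show all three (adjust the Logstash path to wherever you unpacked it):

bin/logstash --version
java -version
uname -srm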

I also tried with the -r flag; same issue.

details:

error:

Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>#<NoMethodError: undefined method `multi_filter' for nil:NilClass>, "backtrace"=>["(eval):1026:in `cond_func_29'", "org/jruby/RubyArray.java:1613:in `each'", "(eval):1023:in `cond_func_29'", "(eval):508:in `filter_func'", "/auto/apollo_users/ssapir/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:271:in `filter_batch'", "org/jruby/RubyArray.java:1613:in `each'", "org/jruby/RubyEnumerable.java:852:in `inject'", "/auto/apollo_users/ssapir/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:269:in `filter_batch'", "/auto/apollo_users/ssapir/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:227:in `worker_loop'", "/auto/apollo_users/ssapir/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:205:in `start_workers'"], :level=>:error} NoMethodError: undefined method `multi_filter' for nil:NilClass
cond_func_29 at (eval):1026
each at org/jruby/RubyArray.java:1613
cond_func_29 at (eval):1023
filter_func at (eval):508
filter_batch at /auto/apollo_users/ssapir/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:271
each at org/jruby/RubyArray.java:1613
inject at org/jruby/RubyEnumerable.java:852
filter_batch at /tmp/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:269
worker_loop at /tmp/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:227
start_workers at /tmp/logstash-2.3.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.0-java/lib/logstash/pipeline.rb:205

java:

java version "1.8.0_31"
Java(TM) SE Runtime Environment (build 1.8.0_31-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.31-b07, mixed mode)

OS:

Linux 3.10.0-327.10.1.el7.x86_64 x86_64

OK, so what do your filters look like?

About 5 files: one defines the input and output, and the other 4 define filters like grok, kv, prune, and mutate.

It works perfectly without the auto-reload flag, and I used the same files on the previous version (2.2.2) without any issue.
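
The directory looks roughly like this (file names are just illustrative):

conf.d/
  01-io.conf        (inputs and outputs)
  10-plugin1.conf   (grok/kv/mutate for plugin 1)
  11-plugin2.conf   (grok/kv for plugin 2)
  12-plugin3.conf   (grok for plugin 3)
  20-status.conf    (prune/geoip based on parse status)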

If you can't provide them, we can't replicate the problem, which makes it very hard to help solve it.

input
{
  tcp {
    port => 1465
    type => syslog
    add_field => {"plugin" => 1}
  }
  tcp {
    port => 1466
    type => syslog
    add_field => {"plugin" => 2}
  }
  tcp {
    port => 1467
    type => syslog
    add_field => {"plugin" => 3}
  }
  tcp { # for testing
    port => 1469
  }
}

filter
{
  mutate
  {
    remove_field => "port"
    convert => [ "plugin", "integer" ]
    rename => {"message" => "received_message"}
  }
}

output
{
  #stdout {}

  if [parse_status] == "failed"
  {
    elasticsearch
    {
      index => "failed"
      document_type => "log"
      hosts => ["10.8.120.28"]
    }
  }
  else
  {
    elasticsearch
    {
      index => "events"
      document_type => "log"
      hosts => ["10.8.120.28"]
      template => "/tmp/logstash-2.2.2/bin/elasticsearch-template.json"
      template_name => "events"
      manage_template => "true"
      template_overwrite => "true"
    }
  }
}

input{

}

filter{
  if [plugin] == 1
  {
    syslog_pri
    {
      syslog_pri_field_name => "syslog5424_pri"
    }

    grok
    {
      match => {"received_message" => "%{SYSLOG5424PRI}%{SYSLOGTIMESTAMP}%{GREEDYDATA}%{WORD:action}%{SPACE}%{IP} %{NOTSPACE} %{GREEDYDATA:kv_data}"}
    }

    kv
    {
      source => "kv_data"
      #trimkey => "\s"
      #field_split => "?;?"
      #value_split => ":"
    }

    mutate
    {
      remove_field => "kv_data"
      rename => {"src" => "source_ip"}
      rename => {"dst" => "target_ip"}
    }
  }
}
output{

}

input{

}

filter{
  if [plugin] == 2
  {
    syslog_pri
    {
      syslog_pri_field_name => "syslog5424_pri"
    }
    grok
    {
      match => {"received_message" => "%{SYSLOG5424PRI}%{SYSLOGTIMESTAMP} %{NOTSPACE}%{SPACE}%{GREEDYDATA:kv_data}"}
    }

    kv
    {
      source => "kv_data"
      value_split => ":"
    }

    mutate
    {
      remove_field => "kv_data"
    }

    grok
    {
      match => {"alert_message" => "%{GREEDYDATA}source address %{IP:source_ip}%{GREEDYDATA}destination address %{IP:target_ip}%{GREEDYDATA}"}
      tag_on_failure => [] # don't tag if failed, not all messages have this
    }
  }
}
output{

}

input{

}

filter{
  if [plugin] == 3
  {
    syslog_pri
    {
      syslog_pri_field_name => "syslog5424_pri"
    }

    grok
    {
      match => {"received_message" => "%{SYSLOG5424PRI}%{SYSLOGTIMESTAMP} %{WORD:product}: %{DATE:date} %{TIME:time} %{WORD:severity} %{WORD:radware_id} %{NOTSPACE:catagory} \"%{DATA:attack_name}\" %{NOTSPACE:protocol} %{IP:source_ip} %{NUMBER:source_port} %{IP:target_ip} %{NUMBER:target_port} %{NUMBER:physical_port} %{WORD:context} \"%{DATA:policy_name}\" %{WORD:attack_status} %{WORD:packet_count} %{NOTSPACE:bandwidth} %{NOTSPACE:vlan_tag} %{NOTSPACE:mpls_rd} %{NOTSPACE:mpls_tag} %{NOTSPACE:risk} %{NOTSPACE:action} %{NOTSPACE:unique_id}"}
    }
  }
}
output{

}

input{

}

filter{

  if "_grokparsefailure" in [tags]
  {
    prune {
      add_field => {"parse_status" => "failed"}
      whitelist_names => ["plugin", "received_message", "host", "@timestamp", "tags"]
    }
  }
  else
  {
    mutate
    {
      add_field => {"parse_status" => "success"}
      remove_field => "received_message"
    }

    geoip
    {
      source => "source_ip"
      target => "source_geoip"
      database => "/tmp/logstash-2.2.2/bin/GeoLocation_City.dat"
    }

    geoip
    {
      source => "target_ip"
      target => "target_geoip"
      database => "/tmp/logstash-2.2.2/bin/GeoLocation_City.dat"
    }
  }
}

output{

}

Just a quick tip, you don't need so many empty input and output stanzas.
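
For example, the plugin 3 file could be reduced to just its filter block; Logstash concatenates every file in the directory into one pipeline config, so the empty input{} and output{} sections add nothing (grok pattern shortened here for readability):

filter {
  if [plugin] == 3 {
    syslog_pri {
      syslog_pri_field_name => "syslog5424_pri"
    }
    grok {
      match => {"received_message" => "%{SYSLOG5424PRI}%{SYSLOGTIMESTAMP} %{WORD:product}: ..."}
    }
  }
}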
