Every now and then my Logstash instance crashes, seemingly at random. Having a look at the logs, the crash appears to be related to a failed DNS lookup:
{:timestamp=>"2016-07-11T20:30:04.574000+0200", :message=>"Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.", "exception"=>#<ConcurrencyError: interrupted waiting for mutex: null>, "backtrace"=>["org/jruby/ext/thread/Mutex.java:94:in `lock'", "org/jruby/ext/thread/Mutex.java:147:in `synchronize'", "/opt/logstash/vendor/jruby/lib/ruby/1.9/resolv.rb:190:in `lazy_initialize'", "/opt/logstash/vendor/jruby/lib/ruby/1.9/resolv.rb:268:in `each_name'", "/opt/logstash/vendor/jruby/lib/ruby/1.9/resolv.rb:151:in `each_name'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/jruby/lib/ruby/1.9/resolv.rb:150:in `each_name'", "/opt/logstash/vendor/jruby/lib/ruby/1.9/resolv.rb:132:in `getname'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-dns-2.1.3/lib/logstash/filters/dns.rb:244:in `getname'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-dns-2.1.3/lib/logstash/filters/dns.rb:231:in `retriable_getname'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-dns-2.1.3/lib/logstash/filters/dns.rb:216:in `retriable_request'", "org/jruby/ext/timeout/Timeout.java:115:in `timeout'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-dns-2.1.3/lib/logstash/filters/dns.rb:215:in `retriable_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-dns-2.1.3/lib/logstash/filters/dns.rb:230:in `retriable_getname'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-dns-2.1.3/lib/logstash/filters/dns.rb:178:in `reverse'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-dns-2.1.3/lib/logstash/filters/dns.rb:156:in `reverse'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-dns-2.1.3/lib/logstash/filters/dns.rb:95:in `filter'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/filters/base.rb:151:in `multi_filter'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/filters/base.rb:148:in `multi_filter'", "(eval):457:in `initialize'", "org/jruby/RubyArray.java:1613:in `each'", "(eval):452:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "(eval):319:in `filter_func'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:267:in `filter_batch'", "org/jruby/RubyArray.java:1613:in `each'", "org/jruby/RubyEnumerable.java:852:in `inject'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:265:in `filter_batch'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:223:in `worker_loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:201:in `start_workers'"], :level=>:error}
This is what I have in my config:
dns {
  reverse => [ "host" ]
  action  => "replace"
}
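As a stopgap I'm thinking about making the lookups fail fast instead of hanging, roughly like this. This is only a sketch: I'm assuming the `timeout`, `max_retries`, and `nameserver` options are available and behave this way in dns filter 2.1.3, and the resolver address is just a placeholder for a local caching nameserver:

dns {
  reverse     => [ "host" ]
  action      => "replace"
  # Assumption: give up on a lookup quickly instead of letting it block the worker
  timeout     => 0.5
  max_retries => 2
  # Placeholder: point this at a nearby caching resolver
  nameserver  => "127.0.0.1"
}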
I don't get why Logstash crashes outright after a failed DNS lookup. Wouldn't it be better if the filter simply added some sort of failure tag to the event, as suggested in https://github.com/logstash-plugins/logstash-filter-dns/issues/24?
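Until something like that exists in the plugin, the only workaround I can think of is to detect the failure myself after the filter runs: with action => "replace", a failed reverse lookup leaves the field as an IP address, so a conditional can tag the event. A rough sketch of what I mean (the tag name "_dnsreversefailed" is just something I made up, and the regex only covers IPv4):

filter {
  dns {
    reverse => [ "host" ]
    action  => "replace"
  }
  # If the reverse lookup failed, "host" still looks like an IPv4 address, so tag it ourselves
  if [host] =~ /^\d{1,3}(\.\d{1,3}){3}$/ {
    mutate { add_tag => [ "_dnsreversefailed" ] }
  }
}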
Should I open an issue about this crash in the GitHub repository linked above?