I am receiving approximately 5,000 events per second, and a lot of Grok, KV, and DNS filters are being used.
My current server has 16 GB of memory and 8 cores with 2 threads per core.
I have set the JVM heap size to 8 GB.
However, the JVM heap keeps filling up, Logstash shuts down, and many events are being dropped.
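For reference, the heap is configured in Logstash's config/jvm.options. A minimal sketch (the 8 GB values are illustrative for a 16 GB host, not a recommendation; keeping -Xms and -Xmx equal and leaving headroom for the OS and off-heap buffers is the usual guidance):

```
# config/jvm.options (excerpt) - illustrative values
-Xms8g
-Xmx8g
```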
This forum is manned by volunteers, so there are no guaranteed SLAs and no guarantee that you will get an answer. Please do not ping people who are not already involved in the thread. If you have not received any response after 2-3 days (excluding weekends), I would recommend bumping the issue, but please give people time to respond.
I do not have an answer to this question, so I will leave it for others. It would probably help if you provided as much information as possible though, e.g. the version used, settings, and information about the pipeline(s)/configuration used.
@Christian_Dahlqvist I apologize for this behaviour
I am using Logstash version 7.3.
I have 3 pipelines and am receiving data from firewalls and applications on a UDP port.
The UDP input for the firewalls is dropping packets.
The server has 16 GB of memory, with the JVM heap set to 14 GB.
This is my input configuration:
udp {
  port => 5600
  queue_size => 163840
  workers => 8
  codec => multiline {
    pattern => "^\s"
    what => "next"
  }
  type => "udp"
}
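A sketch of the same input with explicit socket buffer settings (the numeric values are illustrative assumptions, not tested recommendations; `receive_buffer_bytes` is only honored up to the kernel's `net.core.rmem_max` limit):

```
udp {
  port => 5600
  queue_size => 163840
  workers => 8
  # Ask the kernel for a larger socket receive buffer (bytes);
  # capped by net.core.rmem_max, so that sysctl may need raising too.
  receive_buffer_bytes => 16777216
  # Maximum size of a single received datagram (bytes).
  buffer_size => 65536
  codec => multiline {
    pattern => "^\s"
    what => "next"
  }
  type => "udp"
}
```

One caveat worth noting: the multiline codec assumes a single ordered stream, so on a UDP input fed by many firewalls, lines from different hosts can interleave and be merged incorrectly.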
These are the logs I am getting:
[2020-04-09T12:16:48,652][ERROR][logstash.agent ] Failed to execute action {:id=>:globalfilter, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Something is wrong with your configuration.", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:87:in `config_init'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:126:in `initialize'", "org/logstash/plugins/PluginFactoryExt.java:81:in `filter_delegator'", "org/logstash/plugins/PluginFactoryExt.java:251:in `plugin'", "org/logstash/execution/JavaBasePipelineExt.java:50:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/reload.rb:37:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}
[2020-04-09T12:40:04,281][ERROR][org.logstash.execution.ShutdownWatcherExt] The shutdown process appears to be stalled due to busy or blocked plugins. Check the logs for more information.
[2020-04-09T13:33:37,090][ERROR][org.logstash.execution.ShutdownWatcherExt] The shutdown process appears to be stalled due to busy or blocked plugins. Check the logs for more information.
[2020-04-09T13:33:55,346][ERROR][logstash.inputs.udp ] UDP listener died {:exception=>#<IOError: closed stream>, :backtrace=>["org/jruby/ext/socket/RubyUDPSocket.java:294:in `recvfrom_nonblock'", "org/jruby/ext/socket/RubyUDPSocket.java:286:in `recvfrom_nonblock'", "org/jruby/ext/socket/RubyUDPSocket.java:269:in `recvfrom_nonblock'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-udp-3.3.4/lib/logstash/inputs/udp.rb:125:in `block in udp_listener'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-udp-3.3.4/lib/logstash/inputs/udp.rb:123:in `udp_listener'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-udp-3.3.4/lib/logstash/inputs/udp.rb:68:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:309:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:302:in `block in start_input'"]}
[2020-04-09T14:49:32,340][ERROR][org.logstash.execution.ShutdownWatcherExt] The shutdown process appears to be stalled due to busy or blocked plugins. Check the logs for more information.
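Since the UDP listener errors above suggest the input cannot keep up, one way to confirm kernel-level drops (assuming a Linux host; counter names can vary slightly by kernel version):

```shell
# Kernel-level UDP statistics: a rising InErrors / RcvbufErrors count
# means datagrams are dropped before Logstash can read them.
grep -w Udp /proc/net/snmp

# Current ceiling for socket receive buffers; the udp input's
# receive_buffer_bytes setting cannot exceed this value.
sysctl -n net.core.rmem_max
```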