Logstash 7.5.1 floods syslog with messages

After upgrading from 7.5 to 7.5.1, all messages received from the Filebeat nodes still reach Elasticsearch as expected, but for some reason Logstash has also started flooding /var/log/syslog with the same information, even though I don't use the syslog output plugin anywhere. The syslog file grows far too fast: 2-3 GB in just a few minutes.
In log4j2.properties I commented out the console configuration, but the file still keeps filling up:

rootLogger.level = ${sys:ls.log.level}
#rootLogger.appenderRef.console.ref = ${sys:ls.log.format}_console
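For comparison, a sketch of what the root logger section looks like in the stock Logstash log4j2.properties (assumption: the default 7.x layout, where a rolling-file appender writes logstash-plain.log) with only the console appender disabled:

```properties
# Sketch based on the default Logstash 7.x log4j2.properties (assumption)
rootLogger.level = ${sys:ls.log.level}
# Console appender commented out so nothing goes to stdout/journald:
#rootLogger.appenderRef.console.ref = ${sys:ls.log.format}_console
# Rolling-file appender kept, so Logstash's own log file still works:
rootLogger.appenderRef.rolling.ref = ${sys:ls.log.format}_rolling
```

Note that this only affects Logstash's own logging, not anything the pipeline itself writes to stdout.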

What could be different? Before the upgrade I had no issues at all.
Configuration of one pipeline:

input {
  beats {
    port => 5047
  }
}

filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{TIMESTAMP_ISO8601:log.timestamp} *%{LOGLEVEL:log.level} *%{JAVACLASS:log.class}" }
  }
}

output {
  elasticsearch {
    hosts => ["https://********/"]
    user => "*****"
    password => "*****"
    index => "dev-testing"
  }
  stdout { codec => rubydebug }
}

I suspect that some of these settings may be provoking that behaviour:

"slowlog.logstash.codecs.plain" : "TRACE",
"slowlog.logstash.codecs.rubydebug" : "TRACE",
"slowlog.logstash.filters.grok" : "TRACE",
"slowlog.logstash.filters.json" : "TRACE",
"slowlog.logstash.inputs.beats" : "TRACE",
"slowlog.logstash.outputs.elasticsearch" : "TRACE",
"slowlog.logstash.outputs.stdout" : "TRACE"

Same problem here. My issue was that one of my conf.d files had:

stdout { codec => rubydebug }

I removed that and was happy.
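In case it helps others: when Logstash runs as a systemd service, everything the pipeline writes to stdout is captured by journald, which typically forwards it to rsyslog, so the rubydebug dump of every single event ends up in /var/log/syslog. A sketch of the output block with the debug writer removed (the masked host and credentials are from the original post):

```ruby
output {
  elasticsearch {
    hosts => ["https://********/"]
    user => "*****"
    password => "*****"
    index => "dev-testing"
  }
  # stdout { codec => rubydebug }  # debug only; under systemd this floods syslog
}
```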

@charlesrg, yes, that makes sense. I forgot that stdout is generally used for plug-in debugging :confused:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.