Lumberjack input / Kafka output: "the pipeline is blocked, temporary refusing new connection"

After we upgraded OS packages on the Logstash server, Logstash stopped transmitting messages to Kafka. This is quite an old environment and it uses the lumberjack input plugin.

The architecture looks roughly like this:

Logstash (entity1) ==> Logstash (entity2) ==> Kafka (entity2)

The Logstash (entity2) service keeps logging the following warnings:

[2019-03-21T06:55:09,842][WARN ][logstash.inputs.lumberjack] Lumberjack input: the pipeline is blocked, temporary refusing new connection.
[2019-03-21T06:55:10,342][WARN ][logstash.inputs.lumberjack] Lumberjack input: the pipeline is blocked, temporary refusing new connection.
[2019-03-21T06:55:10,842][WARN ][logstash.inputs.lumberjack] Lumberjack input: the pipeline is blocked, temporary refusing new connection.


The Logstash config on entity2 looks roughly like this:

input {
  lumberjack {
    port => 5000
    type => "logs"
    codec => "json"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder-fqdn.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder-fqdn.key"
    congestion_threshold => 30
  }
}

filter {
  ruby {
    init => 'require "base64"'
    code => 'event.set("[message]", Base64.decode64(event.get("[message]")))'
  }
}

output {
  stdout {
    codec => line { format => "%{@timestamp} - %{[kafka][topic]}" }
  }

  if "XXX" in [kafka][topic] or "YYY" in [kafka][topic] {
    kafka {
      topic_id => "%{[kafka][topic]}"
      codec => plain { format => "%{message}" }
      bootstrap_servers => "XXXXXXX:9092"
      metadata_max_age_ms => "1000"
      retries => 60
      retry_backoff_ms => 5000
      acks => "all"
    }
  }
}
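For reference, the ruby filter above only base64-decodes the [message] field before it is routed to Kafka. Extracted from the pipeline, its logic is just this (the sample string is hypothetical):

```ruby
require "base64"

# What the upstream shipper would send in [message] (hypothetical sample).
encoded = Base64.encode64("hello world")

# What the filter's code block does: decode and store back into [message].
decoded = Base64.decode64(encoded)

puts decoded  # prints "hello world"
```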

Installed RPM packages:

logstash-5.2.2-1.noarch
libselinux-ruby-2.0.94-7.el6.x86_64
ruby-libs-1.8.7.374-4.el6_6.x86_64
rubygems-1.3.7-5.el6.noarch
ruby-augeas-0.4.1-3.el6.x86_64
ruby-1.8.7.374-4.el6_6.x86_64
rubygem-json-1.5.5-3.el6.x86_64
ruby-libs-1.8.7.374-5.el6.x86_64
ruby-rdoc-1.8.7.374-5.el6.x86_64
ruby-irb-1.8.7.374-4.el6_6.x86_64
ruby-shadow-2.2.0-2.el6.x86_64

*** LOCAL GEMS ***

json (1.5.5)


  1. I have checked connectivity and everything looks fine.
  2. If I change the input codec from json to plain, the warnings stop, but the messages are no longer parsed into the fields our Kafka routing logic needs.
  3. If I raise congestion_threshold to a ridiculous value such as 2200000, the lumberjack warnings go away, but no messages reach Kafka; I believe Logstash just spends all its time trying to process them.
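For context on attempt 2: the json input codec is what turns the shipped payload into the [kafka][topic] field that the output conditional tests. With a plain codec the whole payload stays in [message], so the conditional never matches. A rough Ruby sketch of the difference (the payload is hypothetical):

```ruby
require "json"

# Hypothetical payload sent by the upstream Logstash (entity1).
payload = '{"message":"aGVsbG8=","kafka":{"topic":"XXX"}}'

# codec => "json": the payload is parsed into structured fields,
# so [kafka][topic] exists and the output conditional can match.
json_event = JSON.parse(payload)
topic = json_event.dig("kafka", "topic")     # "XXX"

# codec => "plain": the raw payload stays in [message] as one string,
# so there is no [kafka][topic] and nothing is routed to Kafka.
plain_event = { "message" => payload }
missing = plain_event.dig("kafka", "topic")  # nil
```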

Any help or advice to solve this problem is appreciated.

It turned out that the logstash-output-kafka plugin had to be downgraded to a version compatible with Kafka 0.9.0.0.
