Logstash pipeline getting blocked temporarily

I have a Logstash deployment that looks like this:

Multiple Logstash-forwarder ----> Single Logstash Collector ----> Single Kafka Instance ----> Single Logstash Indexer ----> Elastic Search Cluster

The Logstash collector gets stuck, with the following errors in its log:

{:timestamp=>"2016-07-19T02:37:45.004000-0400", :message=>"Exception in lumberjack input thread", :exception=>#<LogStash::CircuitBreaker::OpenBreaker: for Lumberjack input>, :level=>:error}
{:timestamp=>"2016-07-19T02:37:45.005000-0400", :message=>"CircuitBreaker::Open", :name=>"Lumberjack input", :level=>:warn}
{:timestamp=>"2016-07-19T02:37:45.006000-0400", :message=>"Exception in lumberjack input thread", :exception=>#<LogStash::CircuitBreaker::OpenBreaker: for Lumberjack input>, :level=>:error}
{:timestamp=>"2016-07-19T02:37:45.092000-0400", :message=>"Lumberjack input: the pipeline is blocked, temporary refusing new connection.", :level=>:warn}
{:timestamp=>"2016-07-19T02:37:45.139000-0400", :message=>"CircuitBreaker::Open", :name=>"Lumberjack input", :level=>:warn}
{:timestamp=>"2016-07-19T02:37:45.140000-0400", :message=>"Exception in lumberjack input thread", :exception=>#<LogStash::CircuitBreaker::OpenBreaker: for Lumb

I have to restart the Logstash collector; it then works for a few hours before blocking again. I am using Logstash 1.5.3.

Can someone let me know what is going wrong and how I can prevent this?

I am starting the Logstash collector like this:

/opt/logstash-1.5.3/bin/logstash agent -f /opt/logstash-1.5.3/conf/logstash_collector.conf -l /var/log/logstash/logstash.log -w 16

My logstash_collector.conf looks like this:

input {
  lumberjack {
    port => 6782
    codec => json {}
    ssl_certificate => "/opt/logstash-1.5.3/cert/logstash-forwarder.crt"
    ssl_key => "/opt/logstash-1.5.3/cert/logstash-forwarder.key"
    type => "lumberjack"
  }
  lumberjack {
    port => 6783
    ssl_certificate => "/opt/logstash-1.5.3/cert/logstash-forwarder.crt"
    ssl_key => "/opt/logstash-1.5.3/cert/logstash-forwarder.key"
    type => "lumberjack"
  }
}

filter {
  if [env] != "prod" and [env] != "common" {
    drop {}
  }
  if [message] =~ /^\s*$/ {
    drop { }
  }
}

output {
  if "_jsonparsefailure" in [tags] {
    file {
      path => "/var/log/shop/parse_error/%{env}/%{app}/%{app}_%{host}_%{+YYYY-MM-dd}.log"
    }
  } else {
    kafka {
      broker_list => ["kafka:9092"]
      topic_id => "logstash_logs2"
    }
  }
}
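One thing I am wondering about: I have not set congestion_threshold on the lumberjack inputs (the number of seconds the input waits on a blocked pipeline before its circuit breaker opens; the default is 5). If the downstream is only temporarily slow rather than fully stalled, would raising it help? A sketch for one of the inputs, everything else unchanged (the value 30 is just an example):

```
input {
  lumberjack {
    port => 6782
    codec => json {}
    ssl_certificate => "/opt/logstash-1.5.3/cert/logstash-forwarder.crt"
    ssl_key => "/opt/logstash-1.5.3/cert/logstash-forwarder.key"
    type => "lumberjack"
    # wait longer for the pipeline to unblock before tripping
    # the circuit breaker (default is 5 seconds)
    congestion_threshold => 30
  }
}
```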

LSF is deprecated, please use Beats. :)

[quote="Deb, post:1, topic:55829"]
logstash 1.5.3.
[/quote]

Please upgrade, this is super old.

Have you looked at ES to make sure it is ok?
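For a first look, the cluster health API is usually enough (host and port here are assumptions; adjust for your cluster):

```
curl -s 'http://localhost:9200/_cluster/health?pretty'
```

A red status or a steadily growing number of pending tasks would point at Elasticsearch being unhealthy.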

We cannot upgrade immediately. Can you let me know what the problem could be? Is Logstash overloaded? Are there any Logstash parameters we can tune that would help?

The flow is getting blocked in the Logstash collector itself; the events are not even reaching Kafka.
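One way to confirm this is to watch the topic from the Kafka host while the collector is blocked, e.g. with the console consumer that ships with Kafka (the ZooKeeper address is an assumption based on a default install):

```
# watch the topic; if no new events arrive during a blocked period,
# the collector is stopping before handing anything to Kafka
bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic logstash_logs2
```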