Logstash to logstash communication: Connection reset by peer error

I'm trying to set up Logstash-to-Logstash forwarding (version 6.1.3 on both ends), using the lumberjack output and the beats input, as per the guide.

My sender's output config looks like this (the sender's input produces around 600 events per second):

output {
  lumberjack {
    codec => json
    hosts => ["unixdev2"]
    port => 20105
    ssl_certificate => "/home/user/me/log-test/home/log/apps/current/logstash/config/indexer_0/lumberjack-new.crt"
  }
}

Receiver input:

input {
  beats {
    id => "lumberjack"
    port => 20105
    codec => "json"
    ssl_certificate => "/home/user/me/log-test/home/log/apps/current/logstash/config/indexer_0/lumberjack-new.crt"
    ssl_key => "/home/user/me/log-test/home/log/apps/current/logstash/config/indexer_0/lumberjack-new.key"
    ssl => true
    client_inactivity_timeout => 300
  }
}

This successfully sends some events, but then the sender prints an error: Client write error, trying connect {:e=>#<IOError: Connection reset by peer>

[2020-02-18T14:13:10,252][ERROR][logstash.outputs.lumberjack] Client write error, trying connect {:e=>#<IOError: Connection reset by peer>, :backtrace=>["org/jruby/ext/openssl/SSLSocket.java:857:in `sysread'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/jruby-openssl-0.10.2-java/lib/jopenssl23/openssl/buffering.rb:57:in `fill_rbuff'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/jruby-openssl-0.10.2-java/lib/jopenssl23/openssl/buffering.rb:98:in `read'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/jls-lumberjack-0.0.26/lib/lumberjack/client.rb:157:in `read_version_and_type'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/jls-lumberjack-0.0.26/lib/lumberjack/client.rb:145:in `ack'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/jls-lumberjack-0.0.26/lib/lumberjack/client.rb:134:in `write_sync'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/jls-lumberjack-0.0.26/lib/lumberjack/client.rb:42:in `write'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-lumberjack-3.1.5/lib/logstash/outputs/lumberjack.rb:65:in `flush'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/buffer.rb:219:in `block in buffer_flush'", "org/jruby/RubyHash.java:1343:in `each'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/buffer.rb:216:in `buffer_flush'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/buffer.rb:159:in `buffer_receive'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-lumberjack-3.1.5/lib/logstash/outputs/lumberjack.rb:52:in `block in register'", 
"/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/logstash-codec-json-3.0.5/lib/logstash/codecs/json.rb:42:in `encode'", "/home/foo/log_shipper/custom/logstash-6.1.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-lumberjack-3.1.5/lib/logstash/outputs/lumberjack.rb:59:in `receive'", "/home/foo/log_shipper/custom/logstash-6.1.3/logstash-core/lib/logstash/outputs/base.rb:92:in `block in multi_receive'", "org/jruby/RubyArray.java:1734:in `each'", "/home/foo/log_shipper/custom/logstash-6.1.3/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'", "/home/foo/log_shipper/custom/logstash-6.1.3/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:22:in `multi_receive'", "/home/foo/log_shipper/custom/logstash-6.1.3/logstash-core/lib/logstash/output_delegator.rb:50:in `multi_receive'", "/home/foo/log_shipper/custom/logstash-6.1.3/logstash-core/lib/logstash/pipeline.rb:487:in `block in output_batch'", "org/jruby/RubyHash.java:1343:in `each'", "/home/foo/log_shipper/custom/logstash-6.1.3/logstash-core/lib/logstash/pipeline.rb:486:in `output_batch'", "/home/foo/log_shipper/custom/logstash-6.1.3/logstash-core/lib/logstash/pipeline.rb:438:in `worker_loop'", "/home/foo/log_shipper/custom/logstash-6.1.3/logstash-core/lib/logstash/pipeline.rb:393:in `block in start_workers'"]}

The receiver logs some events in debug mode, and after a while only logs the following (the pipeline thread appears to be sleeping):

[2020-02-19T00:43:14,098][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x79e261bd@/home/user/me/log-test/home/log/apps/current/logstash/logstash-core/lib/logstash/pipeline.rb:245 sleep>"}

But when I stop the sender, the shutdown watcher reports that there are still in-flight events (stalling_thread_info points at the last filter in my pipeline):

[2020-02-18T14:16:11,932][WARN ][logstash.shutdownwatcher ] {"inflight_count"=>3000, "stalling_thread_info"=>{["LogStash::Filters::Mutate", {"add_field"=>{"[fields]...}, "id"=>"e072eba1288c48ef69d7704bdc9ce30f2ac85551f885fe47f11c5b04e8d54099"}]=>[{"thread_id"=>40, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>41, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>42, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>43, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>44, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>45, "name"=>nil, "current_call"=>"[...]/vendor/bundle/jruby/2.3.0/gems/jruby-openssl-0.10.2-java/lib/jopenssl23/openssl/buffering.rb:57:in `sysread'"}, {"thread_id"=>46, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>47, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>48, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>49, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>50, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>51, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>52, "name"=>nil, 
"current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>53, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>54, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>55, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>56, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>57, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>58, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>59, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>60, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>61, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>62, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}, {"thread_id"=>63, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:21:in `pop'"}]}}
[2020-02-18T14:16:11,933][ERROR][logstash.shutdownwatcher ] The shutdown process appears to be stalled due to busy or blocked plugins. Check the logs for more information.

Increasing client_inactivity_timeout seems to help a little, but doesn't make the issue go away. What am I doing wrong here?
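For reference, by "increasing" I mean changing only that one setting on the receiver's beats input, e.g. (the 1200 here is just an example value, not a recommendation):

```
input {
  beats {
    # ... same settings as above ...
    client_inactivity_timeout => 1200   # raised from 300; fewer resets, but they still occur
  }
}
```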

I can telnet to the host:

$ telnet unixdev2 20105
Trying 164.55.138.11...
Connected to unixdev2.
Escape character is '^]'.
Connection closed by foreign host.

But the connection closes itself after around 10 seconds. Is this why the sender is unable to stay connected for long? Or is it just because telnet doesn't perform the SSL handshake?
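Since telnet only opens a plaintext TCP connection and the beats input expects TLS, I assume I should be testing the handshake itself, e.g. with openssl s_client (using the same host and port as above):

```shell
# Attempts a TLS handshake against the beats input and prints the
# certificate chain the server presents, which telnet cannot do.
openssl s_client -connect unixdev2:20105
```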

I have seen other questions on "connection reset by peer" but couldn't find any solution (most of them have no replies), so any help is greatly appreciated!

