<Redis::TimeoutError: Connection timed out> when trying to send data to Redis from Logstash

Hello,
We are facing the following error when trying to send data to Redis.

[2017-04-11T13:36:02,010][WARN ][logstash.outputs.redis ] Failed to flush outgoing items {:outgoing_count=>50, :exception=>"Redis::TimeoutError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:111:in `_write_to_socket'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:105:in `_write_to_socket'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:131:in `write'", "org/jruby/RubyKernel.java:1479:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:130:in `write'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:374:in `write'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:271:in `write'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:250:in `io'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:269:in `write'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:228:in `process'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:222:in `process'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:367:in `ensure_connected'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:221:in `process'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:306:in `logging'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:220:in `process'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:120:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:1070:in `rpush'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:58:in `synchronize'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/monitor.rb:211:in `mon_synchronize'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/monitor.rb:210:in `mon_synchronize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:58:in `synchronize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:1069:in `rpush'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-redis-]}

[2017-04-11T13:36:02,010][WARN ][logstash.outputs.redis ] Failed to send backlog of events to Redis {:identity=>"default", :exception=>#<Redis::TimeoutError: Connection timed out>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:111:in `_write_to_socket'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:105:in `_write_to_socket'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:131:in `write'", "org/jruby/RubyKernel.java:1479:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:130:in `write'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:374:in `write'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:271:in `write'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:250:in `io'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:269:in `write'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:228:in `process'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:222:in `process'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:367:in `ensure_connected'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:221:in `process'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:306:in `logging'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:220:in `process'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:120:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:1070:in `rpush'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:58:in `synchronize'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/monitor.rb:211:in `mon_synchronize'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/monitor.rb:210:in `mon_synchronize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:58:in `synchronize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:1069:in `rpush'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-redis-3.0.3/lib/logstash/outputs/redis.rb:179:in `flush'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:221:in `buffer_flush'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:216:in `buffer_flush'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:193:in `buffer_flush'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:159:in `buffer_receive'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-redis-3.0.3/lib/logstash/outputs/redis.rb:236:in `send_to_redis'"]}

Below is the Logstash configuration:

Input Config:
input {
  redis {
    data_type => "list"
    key => "filebeat"
    batch_count => 128
    host => "${REDIS_HOST}"
    db => "${REDIS_DATABASE}"
  }
}

Output Config:
output {
  redis {
    host => "${REDIS_HOST}"
    db => "1"
    data_type => "list"
    key => "filebeat_lowPriority"
    batch => true
    #batch_events => 100
    batch_timeout => 10
    timeout => 10
    reconnect_interval => 30
  }
  if [type] == "jetty" {
    redis {
      host => "${REDIS_HOST}"
      db => "1"
      data_type => "list"
      key => "filebeat_highPriority"
      batch => true
      #batch_events => 100
      batch_timeout => 10
      timeout => 10
      reconnect_interval => 30
    }
  }
}
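
For reference, below is a variant of the first output block I am planning to try next. The values are guesses rather than known-good settings: a longer write timeout, a smaller explicit batch size (the warning above shows flushes of 50 events, which is the plugin default when batch_events is unset), and a shorter reconnect interval. The congestion_threshold / congestion_interval settings are assumed to be available in logstash-output-redis 3.0.3; they make the output pause while the destination list is above a given length instead of hammering an overloaded Redis. The jetty block would get the same changes.

output {
  redis {
    host => "${REDIS_HOST}"
    db => "1"
    data_type => "list"
    key => "filebeat_lowPriority"
    batch => true
    batch_events => 25              # guess: smaller batches make each RPUSH a smaller write
    batch_timeout => 10
    timeout => 30                   # guess: give the socket write more time before Redis::TimeoutError
    reconnect_interval => 5         # guess: retry a failed connection sooner
    congestion_threshold => 100000  # assumed option: pause when the list exceeds this length
    congestion_interval => 1        # assumed option: recheck the list length every second
  }
}

My understanding is that with batch => true the plugin buffers events and sends each flush as a single RPUSH with many values, so the timeout applies to that one large write; a longer timeout or smaller batches should make each write more likely to complete. If Redis itself is slow (persistence stalls, maxmemory pressure, or network latency), raising client-side timeouts would only mask the problem, so the server side is probably worth checking too.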

I tried changing the timeout values in the output config, but I still get the above error.
Please take a look and let me know what changes to the config are needed to make this work.
Thanks in advance.

Hi, can someone please take a look at this issue?
Thanks
