Failed to send event to Redis

I use the following Logstash config:

input {
  tcp {
    port => "7101"
    type => "228-pur-plan-tcp"
    codec => "json_lines"
  }
  tcp {
    host => "0.0.0.0"
    port => "7102"
    type => "228-pur-plan-tcp"
    codec => "json_lines"
  }
  tcp {
    host => "0.0.0.0"
    port => "7103"
    type => "228-pur-plan-tcp"
    codec => "json_lines"
  }
  tcp {
    host => "0.0.0.0"
    port => "7104"
    type => "228-pur-plan-tcp"
    codec => "json_lines"
  }
  tcp {
    host => "0.0.0.0"
    port => "7105"
    type => "228-pur-plan-tcp"
    codec => "json_lines"
  }
  tcp {
    host => "0.0.0.0"
    port => "7106"
    type => "228-pur-plan-tcp"
    codec => "json_lines"
  }
}

filter {
  metrics {
    meter => "tcp_events"
    add_tag => "metrics"
  }
}

output {
  redis {
    host => ["10.1.1.30"]
    port => 22122
    key => "test"
    data_type => "list"
  }
  if "metrics" in [tags] {
    file {
      path => "/opt/lun1/out.metrics"
    }
  }
}

I run Logstash 5.2 with the following tuning:

pipeline.workers: 32
pipeline.output.workers: 32

and

-Xms24g
-Xmx24g

The server runs CentOS 6.8 and has:
80 GB of memory
Intel(R) Xeon(R) CPU E5620 @ 2.40GHz
16 CPUs.

After 10 minutes of running, I got the following error:
[WARN ][logstash.outputs.redis ] Failed to send event to Redis {:event=>2017-02-01T19:14:30.358Z 127.0.0.1 %{message}, :identity=>"default", :exception=>#<Redis::TimeoutError: Connection timed out>, :backtrace=>[
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:111:in `_write_to_socket'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:105:in `_write_to_socket'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:131:in `write'",
  "org/jruby/RubyKernel.java:1479:in `loop'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:130:in `write'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/connection/ruby.rb:374:in `write'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:271:in `write'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:250:in `io'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:269:in `write'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:228:in `process'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:222:in `process'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:367:in `ensure_connected'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:221:in `process'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:306:in `logging'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:220:in `process'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis/client.rb:120:in `call'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:1070:in `rpush'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:58:in `synchronize'",
  "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/monitor.rb:211:in `mon_synchronize'",
  "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/monitor.rb:210:in `mon_synchronize'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:58:in `synchronize'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/redis-3.3.2/lib/redis.rb:1069:in `rpush'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-redis-3.0.3/lib/logstash/outputs/redis.rb:244:in `send_to_redis'",
  "org/jruby/RubyProc.java:281:in `call'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-json-3.0.2/lib/logstash/codecs/json.rb:42:in `encode'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-redis-3.0.3/lib/logstash/outputs/redis.rb:150:in `receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:19:in `multi_receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:43:in `multi_receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:336:in `output_batch'",
  "org/jruby/RubyHash.java:1342:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:335:in `output_batch'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:293:in `worker_loop'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:263:in `start_workers'"]}

Redis::TimeoutError: Connection timed out

But if I use telnet, I can connect to Redis from the Logstash server without any problem, and redis-cli from the Logstash server to the Redis server (a separate machine) also works.
Why can't Logstash connect to Redis?

If I understand correctly, this is a socket timeout; see https://github.com/redis/redis-rb/issues/361#issuecomment-25302287
Try troubleshooting with the guidance from that issue comment.
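
If Redis is simply slow to answer under load, raising the socket timeout on the output can help. If I remember correctly the redis output plugin has a `timeout` option (5 seconds by default); this is only a sketch with an arbitrary value, not a recommendation:

  redis {
    host => ["10.1.1.30"]
    port => 22122
    key => "test"
    data_type => "list"
    # assumed plugin option: socket timeout in seconds, default 5
    timeout => 15
  }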

And/or set these options on the redis output:
batch => true
batch_events => 2000

  # If true, we send an RPUSH every "batch_events" events or
  # "batch_timeout" seconds (whichever comes first).
  # Only supported when `data_type` is "list".
  config :batch, :validate => :boolean, :default => false

  # If batch is set to true, the number of events we queue up for an RPUSH.
  config :batch_events, :validate => :number, :default => 50
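
Putting that together, the output could look something like this (a sketch; `batch_timeout` is, as far as I know, another option of the plugin, defaulting to 5 seconds):

  redis {
    host => ["10.1.1.30"]
    port => 22122
    key => "test"
    data_type => "list"
    batch => true
    # queue up to 2000 events per RPUSH...
    batch_events => 2000
    # ...or flush after 5 seconds, whichever comes first (option assumed from the plugin docs)
    batch_timeout => 5
  }

Batching means one round trip to Redis per 2000 events instead of one per event, which takes a lot of pressure off the connection.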

Also, 32 workers is too many; you will get the same performance with 16, one per core. Logstash really does hammer the CPUs.

Is Redis running on the same box? Is Elasticsearch also running on that box?

If so, the 32 Logstash worker threads will starve Redis/Elasticsearch.

I would set the workers to 8 or 10 to give Redis/Elasticsearch a chance of reasonable performance.
But remember that Redis is single-threaded and event-driven, so it needs at least one core/thread to itself (ignoring OS thread scheduling).
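
Concretely, in logstash.yml that could look like this (a sketch; the numbers are only a starting point to tune from):

  # logstash.yml
  pipeline.workers: 8          # leave headroom for Redis/Elasticsearch and the OS
  pipeline.output.workers: 4   # fewer concurrent Redis connections
  pipeline.batch.size: 250     # bigger batches, fewer flushes (default is 125)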
