Error Flushing Buffer at interval on Output plugin Elasticsearch


(Tat Dat Pham) #1

I'm using RabbitMQ as a message queue with 3 Logstash indexers (Logstash 2.1.1, 8 GB RAM + 8 cores each) feeding a 3-node Elasticsearch 2.1.1 cluster (32 GB RAM + 16 cores per node).
Two of the indexers start normally, but when I start LS-Indexer-3 with `--debug` I get these errors:

```
undefined method `close' for #<Manticore::Client:0x4d8af7a3> {:class=>"NoMethodError", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/http/manticore.rb:115:in `__close_connections'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/http/manticore.rb:115:in `__close_connections'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/base.rb:99:in `__rebuild_connections'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/base.rb:77:in `reload_connections!'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:85:in `sniff!'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:73:in `start_sniffing!'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:73:in `start_sniffing!'", "org/jruby/RubyKernel.java:1479:in `loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:72:in `start_sniffing!'"], :level=>:error, :file=>"logstash/outputs/elasticsearch/http_client.rb", :line=>"89", :method=>"sniff!"}
```

and:

```
Flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x4e8204f1 @operations_mutex=#<Mutex:0x4045e351>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0x447a6f81>, @submit_proc=#<Proc:0x5f03c038@/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.2.0-java/lib/logstash/outputs/elasticsearch/common.rb:55>, @logger=#<Cabin::Channel:0x3033efe2 @metrics=#<Cabin::Metrics:0x27a29802 @metrics_lock=#<Mutex:0x61571625>, @metrics={}, @channel=#<Cabin::Channel:0x3033efe2 ...>>, @subscriber_lock=#<Mutex:0x20940837>, @level=:debug, @subscribers={12590=>#<Cabin::Outputs::IO:0x3ca6544c @io=#<IO:fd 1>, @lock=#<Mutex:0x65198bc3>>}, @data={}>, @last_flush=2016-01-16 15:47:47 +0700, @flush_interval=1, @stopping=#<Concurrent::AtomicBoolean:0x654ce2e7>, @buffer=[], @flush_thread=#<Thread:0x57f08f09 run>>", :interval=>1, :level=>:info, :file=>"logstash/outputs/elasticsearch/buffer.rb", :line=>"90", :method=>"interval_flush"}
```

and:

```
connect timed out {:class=>"Manticore::ConnectTimeout", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.4.4-java/lib/manticore/response.rb:35:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.4.4-java/lib/manticore/response.rb:70:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.4.4-java/lib/manticore/response.rb:245:in `call_once'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.4.4-java/lib/manticore/response.rb:148:in `code'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.15/lib/elasticsearch/transport/transport/http/manticore.rb:71:in `perform_request'", "org/jruby/RubyProc.java:281:in `call'", ..............
```

and finally:

```
Error: Your application used more memory than the safety cap of 500M.
Specify -J-Xmx####m to increase it (#### = cap size in MB).
Specify -w for full OutOfMemoryError stack trace
```

Here is my config:

```
input {
  rabbitmq {
    host => "10.1.6.245"
    queue => "logstash-queue"
    key => "logstash-key"
    exchange => "logstash-rabbitmq"
    threads => 120
    exclusive => false
    prefetch_count => 512
    vhost => "ELK"
    port => 5677
    user => "logstash"
    password => "*****"
  }
}
output {
  elasticsearch {
    hosts => ["10.1.6.242:9200", "10.1.6.243:9200", "10.1.6.241:9200"]
    user => "*****"
    password => "****"
    sniffing => true
    manage_template => false
    #index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    #document_type => "%{[@metadata][type]}"
    index => "%{beatname}-%{+xxxx.ww}"
    document_type => "%{beattype}"
  }
}
```

Also, how do I increase the heap size for Logstash? (I'm running CentOS 6.5.)
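My guess, based on the init scripts of a package install, is that the heap is controlled by the `LS_HEAP_SIZE` environment variable read from `/etc/sysconfig/logstash` — but I'm not sure this is the right approach, so treat this as a sketch:

```shell
# Assumption: RPM-installed Logstash 2.x on CentOS sources /etc/sysconfig/logstash;
# LS_HEAP_SIZE becomes the JVM -Xmx for the Logstash process.
echo 'LS_HEAP_SIZE="4g"' >> /etc/sysconfig/logstash
service logstash restart
```

Is that the correct way, or should I be passing `-J-Xmx` somewhere else?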

