Multiple outputs to elasticsearch?

Hi All,

I would like to send multiple outputs to elasticsearch from a single Logstash config, but the syntax of the following config seems incorrect.

I checked with redis-cli monitor that Redis received both keys, miki and miki2, and forwarded them to the central Logstash instance, but the following config doesn't write either miki or miki2 to elasticsearch. Am I missing something?

input {
        redis {
                host => "redis"
                port => 6379
                type => "redis-input"
                data_type => "list"
                key => "miki"
        }

        redis {
                host => "redis"
                port => 6379
                type => "redis-input"
                data_type => "list"
                key => "miki2"
        }
}

output {
        stdout { codec => rubydebug }
        if [key] == "miki" {
                elasticsearch {
                        cluster => "elasticsearch"
                        host => "elasticsearch"
                        codec => "json"
                        protocol => "http"
                        user => "es_admin"
                        password => "iiiiii"
                }
        }
        if [key] == "miki2" {
                elasticsearch {
                        host => "elasticsearch"
                        codec => "json"
                        protocol => "http"
                        user => "es_admin"
                        password => "iiiiii"
                        index => "franky"
                }
        }
}

Jason
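
A note on the config above: the if [key] == "miki" conditionals assume each event carries a key field, but as far as I know the redis input does not add the list key to the event on its own, so neither conditional would ever match and nothing would reach elasticsearch. A common workaround is to give each input its own type and route on [type] instead. Below is a minimal sketch, reusing the hosts and credentials from the config above; the type values redis-miki and redis-miki2 are made up for illustration:

input {
        redis {
                host => "redis"
                port => 6379
                data_type => "list"
                key => "miki"
                type => "redis-miki"
        }

        redis {
                host => "redis"
                port => 6379
                data_type => "list"
                key => "miki2"
                type => "redis-miki2"
        }
}

output {
        stdout { codec => rubydebug }
        if [type] == "redis-miki" {
                elasticsearch {
                        host => "elasticsearch"
                        protocol => "http"
                        user => "es_admin"
                        password => "iiiiii"
                }
        }
        if [type] == "redis-miki2" {
                elasticsearch {
                        host => "elasticsearch"
                        protocol => "http"
                        user => "es_admin"
                        password => "iiiiii"
                        index => "franky"
                }
        }
}

With this layout, miki events would go to the default logstash-%{+YYYY.MM.dd} index and miki2 events to franky, which matches the intent of the original conditionals.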

Why does it seem incorrect?

Hi Mark,

Because the central Logstash instance doesn't write miki or miki2 to elasticsearch. If I comment out some lines it works, but then it only writes to one index, as in the following config:

input {
        redis {
                host => "redis"
                port => 6379
                type => "redis-input"
                data_type => "list"
                key => "miki"
        }

#       redis {
#               host => "redis"
#               port => 6379
#               type => "redis-input"
#               data_type => "list"
#               key => "miki2"
#       }
}

output {
        stdout { codec => rubydebug }
#       if [key] == "miki" {
                elasticsearch {
#                       cluster => "elasticsearch"
                        host => "elasticsearch"
                        codec => "json"
                        protocol => "http"
                        user => "es_admin"
                        password => "iiiiii"
                }
#       }
#       if [key] == "miki2" {
#               elasticsearch {
#                       cluster => "elasticsearch"
#                       host => "elasticsearch"
#                       codec => "json"
#                       protocol => "http"
#                       user => "es_admin"
#                       password => "iiiiii"
#                       index => "franky"
#               }
#       }
}

That's still not clear.

Is it not creating a logstash-YYYY.MM.DD index for miki or the franky index for miki2?

Hi Mark,

Yes, it doesn't create any index with the config from my earlier post.

It works fine with the config from my 2nd post (with some lines commented out).

$ tail -f /var/log/logstash/logstash.log

{:timestamp=>"2015-08-18T13:55:23.817000+0800", :message=>"Failed to flush outgoing items", :outgoing_count=>21, :exception=>#<Manticore::SocketException: Connection refused>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.4.1-java/lib/manticore/response.rb:35:in `initialize'", "org/jruby/RubyProc.java:271:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.4.1-java/lib/manticore/response.rb:61:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.4.1-java/lib/manticore/response.rb:225:in `call_once'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.4.1-java/lib/manticore/response.rb:128:in `code'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/transport/http/manticore.rb:71:in `perform_request'", "org/jruby/RubyProc.java:271:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/transport/base.rb:190:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/transport/http/manticore.rb:54:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/client.rb:119:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.12/lib/elasticsearch/api/actions/bulk.rb:80:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.0.1-java/lib/logstash/outputs/elasticsearch/protocol.rb:103:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.0.1-java/lib/logstash/outputs/elasticsearch.rb:505:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.0.1-java/lib/logstash/outputs/elasticsearch.rb:504:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.0.1-java/lib/logstash/outputs/elasticsearch.rb:529:in `flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.0.1-java/lib/logstash/outputs/elasticsearch.rb:528:in `flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.20/lib/stud/buffer.rb:219:in `buffer_flush'", "org/jruby/RubyHash.java:1341:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.20/lib/stud/buffer.rb:216:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.20/lib/stud/buffer.rb:112:in `buffer_initialize'", "org/jruby/RubyKernel.java:1511:in `loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.20/lib/stud/buffer.rb:110:in `buffer_initialize'"], :level=>:warn}
{:timestamp=>"2015-08-18T13:55:24.825000+0800", :message=>"Got error to send bulk of actions: Connection refused", :level=>:error}
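
Separately from the routing question, the "Failed to flush outgoing items ... Connection refused" warnings above mean the elasticsearch output cannot open an HTTP connection to the cluster at all, so even correctly routed events would not be indexed until that is resolved. A minimal unconditional output can help confirm connectivity before adding the conditionals back; the explicit port => 9200 below is only an assumption about the default HTTP port and should be adjusted to wherever the cluster actually listens:

output {
        elasticsearch {
                host => "elasticsearch"
                port => 9200
                protocol => "http"
                user => "es_admin"
                password => "iiiiii"
        }
}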