Can't get Logstash to connect to ES (localhost)


(Kyle Hartigan) #1

I am doing a brand new, clean install of Logstash and Elasticsearch (version 2.0), and I am having issues getting Logstash to connect to the Elasticsearch server on the same host. There is no firewall installed, and I can return Elasticsearch server information using curl. Any help/guidance would be greatly appreciated.
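
For reference, this is the kind of check that works fine from this machine (9200 is the default Elasticsearch HTTP port):

    curl -XGET 'http://127.0.0.1:9200/'

and it returns the usual name/cluster/version JSON.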

These are the errors I get:

:timestamp=>"2015-11-03T13:20:18.499000+1000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://127.0.0.1:9200/\"]', but an error occurred and it failed! Are you sure you can reach elasticsearch from this machine using the configuration provided?", :client_config=>{:hosts=>["http://127.0.0.1:9200/"], :ssl=>nil, :transport_options=>{:socket_timeout=>0, :request_timeout=>0, :proxy=>nil, :ssl=>{}}, :transport_class=>Elasticsearch::Transport::Transport::HTTP::Manticore, :logger=>nil, :tracer=>nil, :reload_connections=>false, :retry_on_failure=>false, :reload_on_failure=>false, :randomize_hosts=>false}, :error_message=>"Failed to load class 'org.jruby.RubyObject$Access4JacksonDeserializer4c401575': com.fasterxml.jackson.module.afterburner.ser.BeanPropertyAccessor", :error_class=>"JrJackson::ParseError", :backtrace=>["com/jrjackson/JrJacksonBase.java:83:in `generate'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/jrjackson-0.3.6/lib/jrjackson/jrjackson.rb:59:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json/adapters/jr_jackson.rb:20:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json/adapter.rb:25:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json.rb:136:in `dump'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.14/lib/elasticsearch/api/utils.rb:102:in `__bulkify'", "org/jruby/RubyArray.java:2414:in `map'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.14/lib/elasticsearch/api/utils.rb:102:in `__bulkify'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.14/lib/elasticsearch/api/actions/bulk.rb:82:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:56:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:353:in `submit'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:350:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.2-java/lib/logstash/outputs/elasticsearch.rb:382:in `flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:219:in `buffer_flush'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:216:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:193:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:112:in `buffer_initialize'", "org/jruby/RubyKernel.java:1479:in `loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:110:in `buffer_initialize'"], :level=>:error}

Here is a copy of my input and output configuration:

02-input-logcourier.conf

input {
    courier {
        port            => 4400
        transport       => "tcp"
    }

    tcp {
        port => 4401
        type => "proxy"
    }
}
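
For context, the courier input is not bundled with Logstash; it comes from the third-party log-courier project and, on Logstash 2.x, would have been installed with something along these lines (plugin name per the log-courier project's docs):

    /opt/logstash/bin/plugin install logstash-input-courier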

99-output.conf

output {
    if "_jsonparsefailure" in [tags] or "_grokparsefailure" in [tags] {
        file {
            path => "/var/log/logstash/failures.log"
        }
    } else if "_ignore" in [tags] {
        # Do nothing

    } else {
        if [type] == "proxy" {
            elasticsearch {
                hosts              => ["127.0.0.1:9200"]
                index              => "proxy-%{+YYYY.MM.dd}"
                template           => "/etc/logstash/es-templates/template-nginx-proxy.json"
                template_name      => "proxy"
                template_overwrite => true
            }
        } else {
            # not a type failure, but type still not supported
            file {
                path => "/var/log/logstash/failures.log"
            }
        }
    }
}
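
For completeness, a quick way to rule out a syntax problem in these files is Logstash's config test mode (assuming the stock conf.d layout on this install):

    /opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/

In this case the config parses fine; the failure only happens when the output flushes a bulk request.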

(Nikita) #2

I am experiencing the same error with Logstash 2.0.

Attempted to send a bulk request to Elasticsearch configured at ....:error_class=>"JrJackson::ParseError"

Even though it says ParseError, this parsing used to work with Logstash 1.4.2.


(Kyle Hartigan) #3

Same here, my config worked fine on 1.4 and 1.5.


(Nikita) #4

@Kyle_Hartigan, did it work fine for you with Logstash 1.5.5? I just now installed 1.5.5, and I am facing the same issue.


(Kyle Hartigan) #5

Hey Nikita, after some experimenting today I found the issue is actually
caused by my log-courier installation, and I have raised a ticket on the
Logstash GitHub repo. They have also notified the log-courier devs.

I found that moving back to lumberjack fixed my issues, and I am now running
Elasticsearch 2.0 with Logstash 2 and Kibana 4.2.
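
Roughly, the swap replaces the courier block with a lumberjack input along these lines (a sketch only: the port is carried over from my courier config above, and the SSL certificate/key paths are placeholders, since the lumberjack input requires TLS):

    input {
        lumberjack {
            port            => 4400
            ssl_certificate => "/etc/logstash/ssl/logstash.crt"  # placeholder path
            ssl_key         => "/etc/logstash/ssl/logstash.key"  # placeholder path
        }
    }

The shippers then need to connect to this port with the matching certificate.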


(Nikita) #6

Yeah, I found out too that it's the courier plugin. I was just wondering which version of the courier plugin didn't cause this issue.


(Kyle Hartigan) #7

Here is the link to the repo; if you look at the Issues section you will see
they are actively working on it.


(Kyle Hartigan) #8

I think 1.4 is still working. I have a friend who uses similar configs to mine
and his is still working; he is not keen to upgrade yet.

