I've read a number of forum posts on this subject, but none seem to hold the answer for me. I have X-Pack installed on Elasticsearch and Kibana, and both are running. I can start Logstash and see it appear in the monitoring screen; however, in the console I get this message:
[2018-06-14T14:48:09,014][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://localhost:9200/", :error_type=>LogStash::Outputs::Elasticsearch::HttpClient::Pool::BadResponseCodeError, :error=>"Got response code '401' contacting Elasticsearch at URL 'http://localhost:9200/'"}
I'm using the logstash_system account
I've added these lines to my logstash.yml
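I haven't reproduced the exact lines here, but for 6.x the monitoring credentials in logstash.yml generally look something like this (the values below are placeholders rather than my real ones):

    xpack.monitoring.enabled: true
    xpack.monitoring.elasticsearch.url: "http://localhost:9200"
    xpack.monitoring.elasticsearch.username: "logstash_system"
    xpack.monitoring.elasticsearch.password: "changeme"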
That said (this is not the cause of the 401 responses, but it will cause your next problem), you shouldn't be using logstash_system in your conf file. Per the docs, you need to create a custom user for this purpose; logstash_system will not work in a Logstash pipeline:
X-Pack security comes preconfigured with a logstash_system user ... This user has the minimum permissions necessary for the monitoring function, and should not be used for any other purpose - it is specifically not intended for use within a Logstash pipeline.
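In practice that means creating a writer role and a dedicated user, then pointing the pipeline's elasticsearch output at that user instead. For example, via the Kibana Dev Tools console; the names, password and privilege list below follow the 6.x security docs example and are placeholders, so adjust them to your setup:

    PUT /_xpack/security/role/logstash_writer
    {
      "cluster": ["manage_index_templates", "monitor"],
      "indices": [
        {
          "names": ["logstash-*"],
          "privileges": ["write", "delete", "create_index"]
        }
      ]
    }

    PUT /_xpack/security/user/logstash_internal
    {
      "password": "x-pack-test-password",
      "roles": ["logstash_writer"],
      "full_name": "Internal Logstash User"
    }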
I'm using version 6.2.4
And though changeme isn't exactly the password I'm using, I'm able to access the user management screen in Kibana and set the passwords to exactly what is in my config.
After changing Logstash to use a user with superuser permissions, I am getting this error:
[2018-06-15T10:11:22,074][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://localhost:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :error=>"Got response code '401' contacting Elasticsearch at URL 'http://localhost:9200/'"}
[2018-06-15T10:11:27,147][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-06-15T10:11:27,147][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://localhost:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :error=>"Got response code '401' contacting Elasticsearch at URL 'http://localhost:9200/'"}
[2018-06-15T10:11:31,737][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<NoMethodError: undefined method `<' for nil:NilClass>, :backtrace=>[
  "D:/LogAnalysis/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/common.rb:222:in `get_event_type'",
  "D:/LogAnalysis/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/common.rb:47:in `event_action_tuple'",
  "D:/LogAnalysis/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/common.rb:36:in `block in multi_receive'",
  "org/jruby/RubyArray.java:2486:in `map'",
  "D:/LogAnalysis/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/common.rb:36:in `multi_receive'",
  "D:/LogAnalysis/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:13:in `multi_receive'",
  "D:/LogAnalysis/logstash/logstash-core/lib/logstash/output_delegator.rb:49:in `multi_receive'",
  "D:/LogAnalysis/logstash/logstash-core/lib/logstash/pipeline.rb:477:in `block in output_batch'",
  "org/jruby/RubyHash.java:1343:in `each'",
  "D:/LogAnalysis/logstash/logstash-core/lib/logstash/pipeline.rb:476:in `output_batch'",
  "D:/LogAnalysis/logstash/logstash-core/lib/logstash/pipeline.rb:428:in `worker_loop'",
  "D:/LogAnalysis/logstash/logstash-core/lib/logstash/pipeline.rb:386:in `block in start_workers'"]}
[2018-06-15T10:11:31,987][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: org.jruby.exceptions.RaiseException: (SystemExit) exit
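The 401s mean the elasticsearch output still isn't authenticating, and the NoMethodError is most likely fallout from that: since no connection ever succeeded, the plugin has no recorded Elasticsearch version, and the version comparison in get_event_type blows up on nil. Once a dedicated user exists (see above), the output block in the pipeline config has to reference it explicitly; roughly along these lines, with the host, credentials and index being placeholders:

    output {
      elasticsearch {
        hosts    => ["http://localhost:9200"]
        user     => "logstash_internal"
        password => "x-pack-test-password"
        index    => "logstash-%{+YYYY.MM.dd}"
      }
    }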