mfisher (Max) · September 25, 2019, 5:41pm · #1
I recently used this guide https://www.elastic.co/blog/getting-started-with-elasticsearch-security to secure our ELK stack.
I am able to log in to Kibana and TLS is working correctly. However, my syslogs from Logstash have stopped being added to the indices in Elasticsearch.
I checked the logs for the Logstash service and saw multiple lines with this error:
[2019-09-25T12:24:43,390][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://localhost:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :error=>"Got response code '401' contacting Elasticsearch at URL 'http://localhost:9200/'"}
My logstash-sample.conf looks like this:
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    user => "logstash_system"
    password => "PASS from bin/elasticsearch-setup-passwords auto"
  }
}
If you have TLS enabled, I suppose this should be https and not http?
mfisher (Max) · September 25, 2019, 5:46pm · #3
I thought that at first, but I had configured TLS a few days prior and things were running fine. Out of curiosity I changed it to https anyway, and it didn't resolve the issue.
The built-in logstash_system role does not have privileges to write to the indices. Have a look at the documentation and create a new user and role with the correct privileges.
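For reference, the security documentation describes creating a dedicated writer role and user roughly like the following (run from Kibana Dev Tools as a superuser; the index pattern and password here are placeholders, not values from this thread):

```
POST /_security/role/logstash_writer
{
  "cluster": ["manage_index_templates", "monitor", "manage_ilm"],
  "indices": [
    {
      "names": ["logstash-*"],
      "privileges": ["write", "create", "create_index", "manage", "manage_ilm"]
    }
  ]
}

POST /_security/user/logstash_writer
{
  "password": "CHANGEME",
  "roles": ["logstash_writer"]
}
```

The new user's credentials then replace logstash_system in the elasticsearch output block.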
mfisher (Max) · September 25, 2019, 6:57pm · #5
I created a user "logstash_writer" with the role "logstash_writer_role".
The role has cluster privileges "manage_index_templates, monitor, manage_ilm" and index privileges "write, create, delete, create_index, manage, manage_ilm" for all indices.
I then added those credentials to logstash-sample.conf and still get the same error.
Also, I was mistaken about TLS: it was only configured between Kibana and the browser, so that shouldn't be an issue.
mfisher (Max) · September 25, 2019, 10:25pm · #6
I thought Elasticsearch might be having an issue with the characters I used for the password, so I changed the "logstash_writer" user's password to "testing" and got this:
[2019-09-25T17:12:56,202][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://logstash_writer:xxxxxx@localhost:9200/"}
However, there is now a new error in the log:
[2019-09-25T17:15:36,500][WARN ][logstash.outputs.elasticsearch] Overwriting supplied index logstash-asa with rollover alias cisco-asa
[2019-09-25T17:15:36,738][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError: LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:291:in `perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:278:in `block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:373:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:285:in `block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:341:in `exists?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:359:in `rollover_alias_exists?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/ilm.rb:91:in `maybe_create_rollover_alias'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/ilm.rb:10:in 
`setup_ilm'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:52:in `block in setup_after_successful_connection'"]}
[2019-09-25T17:15:36,847][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
ikakavas (Ioannis Kakavas) · September 26, 2019, 8:00am · #7
Please show us your current Elasticsearch configuration and the elasticsearch output plugin configuration from Logstash. If you have enabled TLS for the HTTP layer, there is no way this will ever work with
hosts => ["http://localhost:9200"]
mfisher (Max) · September 26, 2019, 2:47pm · #8
elasticsearch.yml
:/etc/elasticsearch# grep -v "#" elasticsearch.yml
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
xpack.security.enabled: true
Output from the logstash.conf file:
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => true
    ilm_enabled => "auto"
    ilm_rollover_alias => "cisco-asa"
    ilm_pattern => "000001"
    ilm_policy => "cisco_asa_rollover_policy"
    index => "logstash-asa"
    user => "logstash_writer"
    password => "testing"
  }
}
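As a side note, an output configured with ilm_rollover_alias and ilm_policy like this assumes the named ILM policy already exists in the cluster. A minimal sketch of such a policy (the rollover thresholds below are placeholders, not values from this thread) could be created with:

```
PUT /_ilm/policy/cisco_asa_rollover_policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": {
            "max_size": "50gb",
            "max_age": "30d"
          }
        }
      }
    }
  }
}
```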
ikakavas (Ioannis Kakavas) · September 26, 2019, 2:59pm · #9
Can you authenticate to Elasticsearch with logstash_writer and testing? i.e. can you share the output of
curl -ulogstash_writer:testing http://localhost:9200/_security/_authenticate
And does the error remain as you shared it above? If so, I will move this to the Logstash subforum in the hope that you get some more targeted help.
mfisher (Max) · September 26, 2019, 4:47pm · #10
Output of "curl -ulogstash_writer:testing http://localhost:9200/_security/_authenticate":
{
  "username" : "logstash_writer",
  "roles" : [
    "logstash_writer_role"
  ],
  "full_name" : "Logstash Writer",
  "email" : "",
  "metadata" : { },
  "enabled" : true,
  "authentication_realm" : {
    "name" : "default_native",
    "type" : "native"
  },
  "lookup_realm" : {
    "name" : "default_native",
    "type" : "native"
  }
}
I noticed in the Logstash logs there was an entry saying that SSL was required to be configured for security, so I generated the .p12 with bin/elasticsearch-certutil cert -out config/elastic-certificates.p12 -pass ""
Added this to elasticsearch.yml:
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.keystore.path: elastic-certificates.p12
xpack.security.transport.ssl.truststore.path: elastic-certificates.p12
Now the Logstash logs show the pipeline aborted due to this error:
exception=>#<Manticore::UnknownException: Unsupported or unrecognized SSL message>
I used bin/elasticsearch-certutil cert --pem to generate the cert for Logstash and set the path in the .conf file:
ssl => true
cacert => "/etc/logstash/config/logstash.crt"
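A note on the layers involved here (my addition, not from the original posts): the xpack.security.transport.ssl.* settings only secure node-to-node traffic on the transport port; they do not enable HTTPS on port 9200. If Logstash sets ssl => true while 9200 still speaks plain HTTP, "Unsupported or unrecognized SSL message" is the usual symptom. Enabling TLS on the HTTP layer would look roughly like this in elasticsearch.yml (reusing the same keystore is an assumption):

```
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.keystore.path: elastic-certificates.p12
```

The Logstash output would then also need hosts => ["https://localhost:9200"], with cacert pointing at the CA certificate that signed the Elasticsearch certificate.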
mfisher (Max) · September 26, 2019, 10:10pm · #11
Could you close this thread? There were multiple configuration errors that I found myself, which have made the original issue moot.
Thanks for your help!
mfisher (Max) · September 27, 2019, 6:25pm · #12
It turns out this error was due to privileges within my "logstash_writer_role" role.
I had given the role cluster privileges "manage_index_templates, monitor, manage_ilm" and index privileges "write, create, delete, create_index, manage, manage_ilm" for all indices.
However, whenever Logstash tried to overwrite the index name with the rollover alias, it encountered this error and killed the pipeline:
[2019-09-25T17:15:36,500][WARN ][logstash.outputs.elasticsearch] Overwriting supplied index logstash-asa with rollover alias cisco-asa
[2019-09-25T17:15:36,738][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError: LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:291:in `perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:278:in `block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:373:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:285:in `block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:341:in `exists?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:359:in `rollover_alias_exists?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/ilm.rb:91:in `maybe_create_rollover_alias'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/ilm.rb:10:in 
`setup_ilm'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:52:in `block in setup_after_successful_connection'"]}
[2019-09-25T17:15:36,847][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
Changing the logstash_writer user's role to superuser fixed the issue.
What role privileges would allow Logstash to fully manage the indices without hitting this error?
ikakavas (Ioannis Kakavas) · September 29, 2019, 9:37am · #13
Those should be enough. Can you please verify the privileges that the logstash_writer_role role has, with
GET /_security/role/logstash_writer_role
?
mfisher (Max) · September 30, 2019, 2:56pm · #14
GET _security/roles/logstash_writer_role
Can't connect to _security:80 (Temporary failure in name resolution)
ikakavas (Ioannis Kakavas) · September 30, 2019, 3:32pm · #15
Apologies, I probably wasn't clear.
GET /_security/role/logstash_writer_role
should be run from the Dev Tools in Kibana. If that is not available, you can use curl:
curl -u elastic -X GET "localhost:9200/_security/role/logstash_writer_role?pretty"
mfisher (Max) · September 30, 2019, 3:45pm · #16
Gotcha, here's the output:
{
  "logstash_writer_role" : {
    "cluster" : [
      "manage_ilm",
      "manage_index_templates",
      "monitor"
    ],
    "indices" : [
      {
        "names" : [
          "cisco-ios-*",
          "logstash-asa*",
          "cisco-asa-*",
          "apm-*"
        ],
        "privileges" : [
          "write",
          "create",
          "delete",
          "create_index",
          "manage",
          "manage_ilm"
        ],
        "allow_restricted_indices" : false
      }
    ],
    "applications" : [ ],
    "run_as" : [ ],
    "metadata" : { },
    "transient_metadata" : {
      "enabled" : true
    }
  }
}
ikakavas (Ioannis Kakavas) · October 1, 2019, 5:53am · #17
Please don't post unformatted code, logs, or configuration, as it's very hard to read.
Instead, paste the text and format it with the </> icon or pairs of triple backticks (```), and check the preview window to make sure it's properly formatted before posting. This makes it more likely that your question will receive a useful answer.
It would be great if you could update your post to fix this.
Your privilege list looks right. You previously mentioned that those privileges applied to all indices, but it looks like they are applied only to the ones below:
mfisher:
"names" : [
"cisco-ios- ",
"logstash-asa ",
"cisco-asa- ",
"apm- "
],
Are all the indices that Logstash tries to write to covered by the above list? Is this actually the indices list you saw, or is this a formatting mishap? You seem to be missing a few * from your index patterns.
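One detail worth checking, following on from the error about the rollover alias (this is my inference from how index-pattern matching works, not something confirmed in this thread): a pattern like cisco-asa-* matches the backing indices (cisco-asa-000001, ...) but not the alias name cisco-asa itself, which the ILM setup step needs to check and manage. A pattern without the hyphen before the wildcard would cover both the alias and its backing indices:

```
PUT /_security/role/logstash_writer_role
{
  "cluster": ["manage_index_templates", "monitor", "manage_ilm"],
  "indices": [
    {
      "names": ["cisco-asa*", "cisco-ios-*", "logstash-asa*", "apm-*"],
      "privileges": ["write", "create", "delete", "create_index", "manage", "manage_ilm"]
    }
  ]
}
```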
mfisher (Max) · October 1, 2019, 2:09pm · #18
They all have *'s, and those are all the indices on the stack.
system (system) · Closed · October 29, 2019, 2:09pm · #19
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.