Hello Team,
I have configured security for Elasticsearch, but after enabling it, Logstash is no longer able to connect to Elasticsearch.
I am getting the error below:
[2019-08-08T23:13:27,353][INFO ][org.apache.kafka.clients.consumer.internals.AbstractCoordinator] [Consumer clientId=logstash-0, groupId=logstash] (Re-)joining group
[2019-08-08T23:13:27,379][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-08-08T23:13:27,995][INFO ][org.apache.kafka.clients.consumer.internals.AbstractCoordinator] [Consumer clientId=logstash-0, groupId=logstash] Successfully joined group with generation 65
[2019-08-08T23:13:28,001][INFO ][org.apache.kafka.clients.consumer.internals.ConsumerCoordinator] [Consumer clientId=logstash-0, groupId=logstash] Setting newly assigned partitions []
[2019-08-08T23:13:31,194][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"https://logstash_internal:xxxxxx@10.95.122.113:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [https://logstash_internal:xxxxxx@10.95.122.113:9200/][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}
[2019-08-08T23:13:36,208][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"https://logstash_internal:xxxxxx@10.95.122.113:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [https://logstash_internal:xxxxxx@10.95.122.113:9200/][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}
Logstash config file:
output {
  elasticsearch {
    hosts => [ "https://elasticsearch_host:9200" ]
    index => "logstash-%{+YYYY.MM.dd}"
    user => "logstash_internal"
    password => "some_password"
    ssl_certificate_verification => false
  }
}
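For completeness, the variant with certificate verification left on would look roughly like this — the `ca.pem` path is hypothetical; it would be a PEM export of the CA that signed the node certificate:

```
output {
  elasticsearch {
    hosts => [ "https://elasticsearch_host:9200" ]
    index => "logstash-%{+YYYY.MM.dd}"
    user => "logstash_internal"
    password => "some_password"
    # Hypothetical path: PEM copy of the CA that signed the node certificate
    cacert => "/etc/logstash/certs/ca.pem"
    ssl_certificate_verification => true
  }
}
```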
elasticsearch.yml:
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.truststore.path: certs/elastic-certificates.p12
xpack.security.transport.ssl.keystore.path: certs/elastic-certificates.p12
xpack.security.transport.ssl.enabled: true
xpack.security.http.ssl.truststore.path: certs/elastic-certificates.p12
xpack.security.http.ssl.keystore.path: certs/elastic-certificates.p12
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.client_authentication: optional
xpack.security.enabled: true
xpack.security.authc.run_as.enabled: true
xpack.security.authc.realms.pki1.type: pki
xpack.monitoring.history.duration: 7d
xpack.monitoring.collection.enabled: true
node.name: ${HOSTNAME}
node.master: true
node.data: false
indices.memory.index_buffer_size: 40%
http.cors.enabled: true
http.cors.allow-origin: "*"
ES_JAVA_OPTS: -Xms512m -Xmx512m
discovery.zen.ping.unicast.hosts: es-master
discovery.zen.minimum_master_nodes: 2
DISABLED_xpack.monitoring.exporters.id1.type: http
DISABLED_xpack.monitoring.exporters.id1.host: http://es-monitoring.es.svc.cluster.local:9200
cluster.routing.allocation.awareness.attributes: rack_id
cluster.name: rancher-es
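If the fix turns out to need a PEM CA file for Logstash, I assume it can be exported from the PKCS#12 truststore with openssl. A self-contained sketch of that extraction (it builds a throwaway certificate and bundle first, since my real elastic-certificates.p12 and its password can't be shared — against the real file, only the last openssl command would be needed):

```shell
set -e

# Throwaway self-signed certificate standing in for the real CA
# (assumption: this is only a stand-in, not my actual cert material).
openssl req -x509 -newkey rsa:2048 -nodes -keyout key.pem -out cert.pem \
  -subj "/CN=demo-ca" -days 1

# Bundle it into a PKCS#12 file, analogous to certs/elastic-certificates.p12.
openssl pkcs12 -export -inkey key.pem -in cert.pem \
  -passout pass:changeme -out demo.p12

# Extract the certificate as PEM; this ca.pem is what the Logstash
# output's `cacert` option would point at.
openssl pkcs12 -in demo.p12 -passin pass:changeme -nokeys -out ca.pem

# Count the certificates extracted.
grep -c "BEGIN CERTIFICATE" ca.pem
```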
Please suggest what I'm doing wrong here.
Regards,
Akshay.