Hi everyone. My Elasticsearch self-signed certificate is about to expire, and I have more than a thousand Logstash VMs ingesting data into Elasticsearch using the Elasticsearch credentials. It is not feasible to change the configuration on all of those Logstash VMs if I change the certificates. Please help me through this. What else can be done here?
You need to provide more details.
We don't know what your current configuration looks like, so we really aren't going to be able to offer you good advice about the steps you can take.
This is what my elasticsearch.yml looks like
http.cors.enabled: true
#discovery.zen.ping_timeout: 100s
#discovery.zen.fd.ping_timeout: 100s
#cluster.max_shards_per_node: 2000
node.master: false
node.data: true
xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.keystore.path: elastic-certificates.p12
xpack.security.transport.ssl.truststore.path: elastic-certificates.p12
You also need to share an example of your logstash output.
Are you going to change the CA as well?
output {
  elasticsearch {
    hosts => ["https://xyz.newyorktimes.in:443"]
    index => "timesjobs-%{+YYYY.MM.dd}"
    user => "logstash"
    password => "${ES_PWD}"
    ssl_certificate_verification => true
    cacert => "/etc/logstash/conf.d/star.timesinternet.in.pem"
  }
}
This is my Logstash config for pushing data into ES. Can you please suggest any option that doesn't involve changing the CA, such as extending the certificate expiry or generating a new certificate, so I don't have to touch the Logstash config?
It really depends on how you generated it; this is mostly unrelated to Elasticsearch or Logstash.
Is your CA expiring as well? If the CA is not expiring, you can just generate new certificates for your Elasticsearch cluster and swap them in; that requires no changes to Logstash, since Logstash only trusts the CA.
But if your CA is expiring as well, there is not much you can do: you will need to change your configurations to use the new CA.
Or at least replace the CA file with the new one and restart the instances.
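For example, if you still have the CA keystore that was originally used to sign the node certificates (elastic-stack-ca.p12 is just the certutil default name and an assumption here, yours may be different), something along these lines would generate new node certificates signed by the same CA:

# Sketch only: creates a new PKCS12 keystore signed by the existing CA,
# valid for roughly 3 years; adjust file names and validity to your setup.
bin/elasticsearch-certutil cert \
  --ca elastic-stack-ca.p12 \
  --days 1095 \
  --out new-elastic-certificates.p12

You would then replace elastic-certificates.p12 on the Elasticsearch nodes with the new file and restart them, with no change needed on the Logstash side.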
This is the command I ran in Dev Tools:
GET _ssl/certificates
and it returned the result below.
[
  {
    "path" : "elastic-certificates.p12",
    "format" : "PKCS12",
    "alias" : "instance",
    "subject_dn" : "CN=Elastic Certificate Tool Autogenerated CA",
    "serial_number" : "dccaddtfasgyd2ssjnhu3djjse104cb0fce547f",
    "has_private_key" : false,
    "expiry" : "2024-07-04T09:31:38.000Z"
  },
  {
    "path" : "elastic-certificates.p12",
    "format" : "PKCS12",
    "alias" : "ca",
    "subject_dn" : "CN=Elastic Certificate Tool Autogenerated CA",
    "serial_number" : "dccaddtfasgyd2ssjnhu3djjse104cb0fce547f",
    "has_private_key" : false,
    "expiry" : "2024-07-04T09:31:38.000Z"
  },
  {
    "path" : "elastic-certificates.p12",
    "format" : "PKCS12",
    "alias" : "instance",
    "subject_dn" : "CN=instance",
    "serial_number" : "dccaddtfasgyd2ssjnhu3djjse104cb0fce547f",
    "has_private_key" : true,
    "expiry" : "2024-07-04T09:31:39.000Z"
  }
]
Your CA is expiring as well, so you will need to change the CA file used in all of your Logstash outputs, the one referenced by cacert => "/etc/logstash/conf.d/star.timesinternet.in.pem". There is no other way.
What you can do to reduce the impact is change all your outputs to not validate the certificate. That way you can change the certificate in Elasticsearch and Logstash will ignore it, so it will keep sending data.
Then you can change the certificate file in Logstash and fix this setting.
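As a sketch based on your output above, the interim config would look something like this (with verification disabled, the cacert value is ignored while the certificates are being rotated):

output {
  elasticsearch {
    hosts => ["https://xyz.newyorktimes.in:443"]
    index => "timesjobs-%{+YYYY.MM.dd}"
    user => "logstash"
    password => "${ES_PWD}"
    # temporarily skip certificate validation while the CA is replaced
    ssl_certificate_verification => false
    cacert => "/etc/logstash/conf.d/star.timesinternet.in.pem"
  }
}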
Basically you're asking me to set the following in my Logstash output:
ssl_certificate_verification => false
But when I generate another certificate, will I have to provide new credentials again, or can it work without them?
Yeah, this is a temporary solution to try to minimize the impact, since you need to replace the CA.
After you have created the new CA and changed it in Elasticsearch, you can enable verification again in Logstash.
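Once the new CA is in place, the output would go back to something like this (new-ca.pem is just a placeholder name; you could also keep the original path and simply overwrite the file contents with the new CA). The user and password stay the same, since the credentials are not tied to the certificate:

output {
  elasticsearch {
    hosts => ["https://xyz.newyorktimes.in:443"]
    index => "timesjobs-%{+YYYY.MM.dd}"
    user => "logstash"
    password => "${ES_PWD}"
    ssl_certificate_verification => true
    # point cacert at the new CA certificate (placeholder path)
    cacert => "/etc/logstash/conf.d/new-ca.pem"
  }
}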
The thing is, I can renew the SSL certificate below, but is there any way to extend my CA's expiry?
/etc/logstash/conf.d/star.timesinternet.in.pem