Logstash 1.5.4 Output to Elasticsearch with https

Hello everyone,

I am new to the Elastic Stack and have no experience beyond googling for solutions. In my environment I have to use Logstash on a remote server that I cannot change, and I have no control over the installed version. Currently it is Logstash 1.5.4.

I want to send my log data directly to Elasticsearch, which means I have to use an HTTPS URL, and with this old Logstash version (1.5.4) I have no idea how to do that.
A few weeks ago I was sending my data to a Kafka server. That worked for me, but my requirements have changed, so I now have to adjust the output section of my Logstash config to send data to an Elasticsearch server.

This is my current logstash configuration:

output {
		elasticsearch {
			protocol => "http"
			host => "https://myelasticserver.com"
			port => "443"
			user => "myuser"
			password => "mysecretpassword"
			index => "classic-dev-%{+YYYY.MM.dd}"
		}
}

"host" should be my elasticserver address, which is secured with basic auth.

If I run Logstash with my actual parameters, I get the following error:
"Failed to install template: https: Name or service not known {:level=>:error}"

Thanks for your help in advance.

Logstash 1.5.4 is very old and not compatible with recent Elasticsearch versions. I suspect you need to either upgrade or use a message queue like Kafka as an intermediate step.
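Very roughly, the intermediate step could look like this. It is only a sketch: the broker address, topic name, and credentials are placeholders, and the option names of the kafka plugins vary between plugin versions, so check the docs for the versions you actually have installed.

# On the remote server (Logstash 1.5.4): keep writing to Kafka.
output {
		kafka {
			broker_list => "mykafkabroker:9092"    # older kafka output plugins use broker_list; newer ones use bootstrap_servers
			topic_id => "classic-dev"
		}
}

# On a host you control (Logstash 7.9.x): read from Kafka and ship to Elasticsearch over HTTPS.
input {
		kafka {
			bootstrap_servers => "mykafkabroker:9092"
			topics => ["classic-dev"]
		}
}
output {
		elasticsearch {
			hosts => ["https://myelasticserver.com:443"]
			user => "myuser"
			password => "mysecretpassword"
			index => "classic-dev-%{+YYYY.MM.dd}"
		}
}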


Thanks for your answer. Is there any chance to use HTTPS with version 1.5.4?

I do not know. I have not used that version in a long, long time.

What is the elasticsearch version?

Hello, currently I am using

  "version" : {
    "number" : "7.9.3",
    ...

This is not going to work IMO.

You need to use the same major version (even better the same exact version). So if you want to use LS 1.5, use Elasticsearch 1.5 which I definitely do not recommend!

Your best chance IMO is to upgrade LS to 7.9.3.

Thanks for your help. I don't want to use LS 1.5, but I have no rights to update it. So is it not possible for these different versions of LS and ES to connect to each other directly (without an MQ like Kafka)?

That might work, but that's an ancient version and it might not work with your Kafka version.

In my case, I just don't know what I can do anymore. I know my LS version is deprecated, but I have no influence on that and cannot change it. I think that LS does not connect to my "https" Elasticsearch URL. My current error is:

Failed to install template: https: Name or service not known {:level=>:error}

Further configuration with "ssl => ..." did not help either.

As the bulk API has not changed in a long time, I do not see why it would not work. The "Name or service not known" error most likely means the plugin is treating the "https" part of your host value as the hostname, because in that version the host option should not contain the scheme. Try the following settings:

output {
		elasticsearch {
			protocol => "http"
			ssl => true
			host => "myelasticserver.com:443"
			user => "myuser"
			password => "mysecretpassword"
			index => "classic-dev-%{+YYYY.MM.dd}"
		}
}

Based on the docs for this old version, it looks like the way the host and port parameters are used has changed since then.

Thanks for your suggestion. I tried this in my environment. Now this error is shown:

Failed to install template: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target {:level=>:error}
Logstash startup completed
Got error to send bulk of actions: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target {:level=>:error}

Is there a setting in the LS output section where I can define the certificate path?
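What I have in mind is something like the following. This is only a sketch: I do not know whether the cacert option actually exists in this plugin version, and the certificate path is just a placeholder.

output {
		elasticsearch {
			protocol => "http"
			ssl => true
			cacert => "/etc/ssl/certs/my-ca.pem"    # CA certificate that signed the server's certificate
			host => "myelasticserver.com:443"
			user => "myuser"
			password => "mysecretpassword"
			index => "classic-dev-%{+YYYY.MM.dd}"
		}
}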

I do not know as I have not used that version in years.
