Unable to retrieve license information from license server

Setting up a multi-node ELK 8.2 stack and hitting an issue I've not been able to sort out. The basics of Elasticsearch and Kibana are working, but Logstash doesn't want to connect to the one Elasticsearch machine I have. There are three nodes on the same network: Elasticsearch, Kibana and Logstash.

May 16 00:31:56 ls1.domain.com logstash[343480]: [2022-05-16T00:31:56,482][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"Got response code '401' contacting Elasticsearch at URL 'https://10.1.1.1:9200/_xpack'"}
May 16 00:31:56 ls1.domain.com logstash[343480]: [2022-05-16T00:31:56,501][ERROR][logstash.monitoring.internalpipelinesource] Failed to fetch X-Pack information from Elasticsearch. This is likely due to failure to reach a live Elasticsearch cluster.

However, if I test using curl from the same machine it works fine:

me@ls1:~$ curl -k -X GET -H "Authorization: ApiKey $API_KEY" "https://10.1.1.1:9200/_xpack"
{"build":{"hash":"********","date":"2022-04-20T10:35:10.180408517Z"},"license":{"uid":"******","type":"basic","mode":"basic","status":"active"}
,"features":{"aggregate_metric":{"available":true,"enabled":true},"analytics":{"available":true,"enabled":true},"ccr":{"available":false,"enabled":true},"data_streams":{"available":true,"enabled":true},"da
ta_tiers":{"available":true,"enabled":true},"enrich":{"available":true,"enabled":true},"eql":{"available":true,"enabled":true},"frozen_indices":{"available":true,"enabled":true},"graph":{"available":false,
"enabled":true},"ilm":{"available":true,"enabled":true},"logstash":{"available":false,"enabled":true},"ml":{"available":false,"enabled":true},"monitoring":{"available":true,"enabled":true},"rollup":{"avail
able":true,"enabled":true},"searchable_snapshots":{"available":false,"enabled":true},"security":{"available":true,"enabled":true},"slm":{"available":true,"enabled":true},"spatial":{"available":true,"enable
d":true},"sql":{"available":true,"enabled":true},"transform":{"available":true,"enabled":true},"voting_only":{"available":true,"enabled":true},"watcher":{"available":false,"enabled":true}},"tagline":"You k
now, for X"}
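
For what it's worth, -k skips certificate verification, so a check that also exercises the same CA file Logstash is pointed at (see logstash.yml below) would be something like:

me@ls1:~$ curl --cacert /etc/logstash/certs/http_ca.crt -X GET -H "Authorization: ApiKey $API_KEY" "https://10.1.1.1:9200/_xpack"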

So I guess this rules out the api_key in use, connectivity, and the configuration of Elasticsearch. I've disabled all config under /etc/logstash/conf.d; I'm just trying to get Logstash monitoring working. logstash.yml is default apart from:

node.name: ls1
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.hosts: ["https://10.1.1.1:9200"]
xpack.monitoring.elasticsearch.api_key: "******=="
xpack.monitoring.elasticsearch.ssl.certificate_authority: "/etc/logstash/certs/http_ca.crt"

What am I missing? Most other articles use basic auth, or struggle with SSL. There are no errors in the Elasticsearch logs.

Hi @dmgeurts Welcome to the community.

Can you provide your actual Logstash pipeline config so we can see it, especially the output section?

Are there any other log messages around the connection attempt about failing to connect, e.g. "can not establish connection...", etc.?

Also, trying to debug through the legacy monitoring settings in your logstash.yml is probably not that helpful; can you comment those all out for the time being?


Is this a single or multiple node cluster?

Sorry for the slow response. I don't have a pipeline yet; I thought I'd get Logstash monitoring working first. Your suggestion appears to be to ignore monitoring for now and just focus on the output pipeline from Logstash into Elasticsearch?
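
For reference, my understanding is that a minimal pipeline with just the Elasticsearch output being asked about would look roughly like this (a sketch only; the file name, input and index are placeholders, not config I'm actually running):

# /etc/logstash/conf.d/test.conf (hypothetical)
input {
  stdin { }
}

output {
  elasticsearch {
    hosts   => ["https://10.1.1.1:9200"]
    api_key => "<id>:<api_key>"
    ssl     => true
    cacert  => "/etc/logstash/certs/http_ca.crt"
    index   => "logstash-test"
  }
}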

One error I found was that I hadn't listed the <id>, only the <api_key>. Once I set what I thought was the right string as <id>:<api_key>, I now get "Authentication using apikey failed - invalid credentials" in the Elasticsearch logs. However, I created the api_key with:

POST /_security/api_key
{
  "name": "logstash_els00", 
  "role_descriptors": {
    "logstash_monitoring": { 
      "cluster": ["monitor"],
      "index": [
        {
          "names": [".monitoring-ls-*"],
          "privileges": ["create_index", "create"]
        }
      ]
    }
  }
}

As per: https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-create-api-key.html
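
For context, the create API key call returns the pieces that matter here; roughly (placeholders, not my real values):

{
  "id": "<id>",
  "name": "logstash_els00",
  "api_key": "<api_key>",
  "encoded": "<base64 of id:api_key>"
}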

Multinode

So, time to fess up... I made two simple but big mistakes with the api_key:

  1. I failed to include the "id"
  2. I used the encoded value rather than the "api_key"

In hindsight, I'm proper kicking myself for not seeing this earlier. But I'm there now with working api_keys. So the format to use in config files is "<id>:<api_key>". Don't use the "encoded" string as I did, and don't forget to add the ID either...!
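
Putting it together, the relevant line in logstash.yml ends up looking something like this (placeholders, not my real key):

xpack.monitoring.elasticsearch.api_key: "<id>:<api_key>"   # the plain id and api_key fields, not the "encoded" string

The confusion is understandable, since the HTTP Authorization: ApiKey header (as in the curl test earlier) takes the base64 "encoded" value, whereas this Logstash setting wants the plain "<id>:<api_key>" pair, which I believe is why curl worked while the monitoring setting did not.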


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.