Configure Filebeat to use API keys

I have Elasticsearch, Kibana, and Filebeat (version 7.15.2 for all) running on a CentOS 8 machine with minimal security settings. I now want to get them working with TLS enabled.

I have Elasticsearch and Kibana working with TLS enabled, but I am having trouble getting Filebeat to connect.

I created an API key through the console with the following:

POST /_security/api_key
{
  "name": "filebeat_elk", 
  "role_descriptors": {
    "filebeat_writer": { 
      "cluster": ["monitor", "read_ilm", "read_pipeline", "manage_index_templates", "manage_ilm", "manage_ingest_pipelines"],
      "index": [
        {
          "names": ["filebeat-*"],
          "privileges": ["view_index_metadata", "create_doc", "create_index", "manage_ilm", "write"]
        }
      ]
    }
  }
}
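For reference, the create-API-key response has roughly this shape (the values below are illustrative placeholders, not from my cluster):

```json
{
  "id": "VuaCfGcBCdbkQm-e5aOx",
  "name": "filebeat_elk",
  "api_key": "ui2lp2axTNmsyakw9tvNnw"
}
```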

In my filebeat.yml I updated the output.elasticsearch section:

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  api_key: "<id>:<api-key>"
  #username: "filebeat"
  #password: ${ES_PWD}

Where <id> and <api-key> are the id and api_key values returned by the console POST.
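As a sanity check, the key can be exercised with curl before pointing Filebeat at it. Note the difference in formats: Filebeat takes the raw `id:api_key` pair, while the HTTP Authorization header takes the same pair base64-encoded. The CA path and the example id/key values here are assumptions; substitute your own.

```shell
# Hypothetical values standing in for the id and api_key returned by the POST above.
ID="myid"
KEY="mykey"

# Filebeat's api_key setting takes the raw pair; the Authorization
# header wants the base64 encoding of that same pair.
ENCODED=$(printf '%s:%s' "$ID" "$KEY" | base64)
echo "$ENCODED"

# Should return the cluster banner if the key and CA are set up correctly.
# (CA path is an assumption; use wherever you copied the CA certificate.)
curl --cacert /etc/filebeat/ca.crt \
     -H "Authorization: ApiKey $ENCODED" \
     "https://localhost:9200/"
```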

The last few lines of /var/log/messages contain:

Dec 13 22:58:26 ip-172-31-43-130 filebeat[50926]: 2021-12-13T22:58:26.411Z#011ERROR#011[publisher_pipeline_output]#011pipeline/output.go:154#011Failed to connect to backoff(elasticsearch(http://localhost:9200)): Get "http://localhost:9200": EOF
Dec 13 22:58:26 ip-172-31-43-130 filebeat[50926]: 2021-12-13T22:58:26.411Z#011INFO#011[publisher_pipeline_output]#011pipeline/output.go:145#011Attempting to reconnect to backoff(elasticsearch(http://localhost:9200)) with 31 reconnect attempt(s)
Dec 13 22:58:26 ip-172-31-43-130 filebeat[50926]: 2021-12-13T22:58:26.411Z#011INFO#011[publisher]#011pipeline/retry.go:219#011retryer: send unwait signal to consumer
Dec 13 22:58:26 ip-172-31-43-130 filebeat[50926]: 2021-12-13T22:58:26.411Z#011INFO#011[publisher]#011pipeline/retry.go:223#011  done
Dec 13 22:58:43 ip-172-31-43-130 filebeat[50926]: 2021-12-13T22:58:43.605Z#011INFO#011[monitoring]#011log/log.go:184#011Non-zero metrics in the last 30s#011{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":220,"time":{"ms":4}},"total":{"ticks":1140,"time":{"ms":11},"value":1140},"user":{"ticks":920,"time":{"ms":7}}},"handles":{"limit":{"hard":262144,"soft":1024},"open":12},"info":{"ephemeral_id":"97b37c95-3ce6-472e-9485-03b30db73009","uptime":{"ms":1230084},"version":"7.15.2"},"memstats":{"gc_next":29230544,"memory_alloc":17758280,"memory_total":159655784,"rss":125390848},"runtime":{"goroutines":42}},"filebeat":{"events":{"active":5,"added":5},"harvester":{"open_files":2,"running":2}},"libbeat":{"config":{"module":{"running":1},"scans":3},"output":{"events":{"active":0},"write":{"bytes":205}},"pipeline":{"clients":2,"events":{"active":675,"published":5,"retry":50,"total":5}}},"registrar":{"states":{"current":2}},"system":{"load":{"1":0.04,"15":0.08,"5":0.03,"norm":{"1":0.02,"15":0.04,"5":0.015}}}}}}
Dec 13 22:59:13 ip-172-31-43-130 filebeat[50926]: 2021-12-13T22:59:13.604Z#011INFO#011[monitoring]#011log/log.go:184#011Non-zero metrics in the last 30s#011{"monitoring": {"metrics": {"beat":{"cgroup":{"memory":{"mem":{"usage":{"bytes":4096}}}},"cpu":{"system":{"ticks":220,"time":{"ms":4}},"total":{"ticks":1160,"time":{"ms":22},"value":1160},"user":{"ticks":940,"time":{"ms":18}}},"handles":{"limit":{"hard":262144,"soft":1024},"open":12},"info":{"ephemeral_id":"97b37c95-3ce6-472e-9485-03b30db73009","uptime":{"ms":1260084},"version":"7.15.2"},"memstats":{"gc_next":29601920,"memory_alloc":15381536,"memory_total":160586560,"rss":125390848},"runtime":{"goroutines":42}},"filebeat":{"events":{"active":1,"added":1},"harvester":{"open_files":2,"running":2}},"libbeat":{"config":{"module":{"running":1},"scans":3},"output":{"events":{"active":0}},"pipeline":{"clients":2,"events":{"active":676,"published":1,"total":1}}},"registrar":{"states":{"current":2}},"system":{"load":{"1":0.02,"15":0.07,"5":0.02,"norm":{"1":0.01,"15":0.035,"5":0.01}}}}}}

I've gone over the documentation several times, and can't figure out what I'm missing.

Update: digging further back into the logs, I see

Dec 13 23:07:40 ip-172-31-43-130 filebeat[51168]: 2021-12-13T23:07:40.001Z#011ERROR#011[esclientleg]#011eslegclient/connection.go:220#011error connecting to Elasticsearch at https://localhost:9200: Get "https://localhost:9200": x509: certificate signed by unknown authority
Dec 13 23:07:40 ip-172-31-43-130 filebeat[51168]: 2021-12-13T23:07:40.001Z#011ERROR#011[modules]#011fileset/factory.go:158#011Error loading pipeline: Error creating Elasticsearch client: couldn't connect to any of the configured Elasticsearch hosts. Errors: [error connecting to Elasticsearch at https://localhost:9200: Get "https://localhost:9200": x509: certificate signed by unknown authority]

So it looks like I have a new research direction: figuring out how to solve this.

It's because Filebeat doesn't trust the certificate presented by Elasticsearch. You'll either need to disable SSL verification (not recommended outside of testing) or copy the CA certificate to the Filebeat host and update the ssl settings in the Elasticsearch output.
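Concretely, a sketch of the two options in filebeat.yml (the CA path is an assumption; adjust it to wherever you copied the file):

```yaml
output.elasticsearch:
  hosts: ["localhost:9200"]
  protocol: "https"
  api_key: "<id>:<api-key>"
  # Option 1 (preferred): trust the CA that signed the Elasticsearch certificate.
  ssl.certificate_authorities: ["/etc/filebeat/certs/ca.crt"]
  # Option 2 (testing only): skip certificate verification entirely.
  #ssl.verification_mode: "none"
```

Note that `protocol: "https"` must also be uncommented, which explains the earlier "http://localhost:9200: EOF" errors: Filebeat was speaking plain HTTP to a TLS-only endpoint.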
