Hello,
I have a single-node deployment of Elasticsearch, Enterprise Search, and Kibana, all on version 8.6.2. On this node I had self-monitoring enabled via xpack, and I'm now trying to switch to Metricbeat.
From 'Stack Monitoring' > Clusters > elasticsearch > Nodes, I entered 'setup mode' in the UI and followed the quick-start steps, but I'm hitting errors when starting Metricbeat.
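For context, my understanding of the migration is that the legacy collectors get disabled via a cluster settings update (I believe this is what the setup-mode "disable self monitoring" step applies; correct me if I have the setting wrong):

```
PUT _cluster/settings
{
  "persistent": {
    "xpack.monitoring.elasticsearch.collection.enabled": false
  }
}
```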
Below are the relevant configs.
elasticsearch.yml
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
xpack.security.enabled: true
xpack.security.enrollment.enabled: true
xpack.security.http.ssl:
  enabled: true
  keystore.path: certs/http.p12
xpack.security.transport.ssl:
  enabled: true
  verification_mode: certificate
  keystore.path: certs/transport.p12
  truststore.path: certs/transport.p12
cluster.initial_master_nodes: ["elastic"]
http.host: 0.0.0.0
Output of 'metricbeat export config':
metricbeat:
  config:
    modules:
      path: /etc/metricbeat/modules.d/*.yml
      reload:
        enabled: false
output:
  elasticsearch:
    hosts:
      - http://localhost:9200
    password: [redacted]
    username: elastic
path:
  config: /etc/metricbeat
  data: /var/lib/metricbeat
  home: /usr/share/metricbeat
  logs: /var/log/metricbeat
processors:
  - add_host_metadata: null
setup:
  template:
    settings:
      index:
        codec: best_compression
        number_of_shards: 1
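One thing I'm suspicious of: elasticsearch.yml has xpack.security.http.ssl.enabled: true, but the Metricbeat output points at http://localhost:9200. If that mismatch is the problem, I'd guess the output section needs to look something like this (the CA path is the default 8.x auto-setup location; treat it as an assumption for your install):

```yaml
output:
  elasticsearch:
    hosts:
      - https://localhost:9200    # https, since http.ssl is enabled on the node
    username: elastic
    password: [redacted]
    ssl:
      certificate_authorities:
        - /etc/elasticsearch/certs/http_ca.crt   # assumed default CA from 8.x auto-setup
```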
ls -l /etc/metricbeat/modules.d/ | grep elasticsearch
-rw-r--r--. 1 root root 295 May 16 07:23 elasticsearch.yml
I also moved the elasticsearch-xpack.yml file aside in an attempt to quiet any noise, but to no avail.
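The same question applies to the module config, since the failing elasticsearch.node / node_stats fetches also go to http://localhost:9200. My guess is that modules.d/elasticsearch.yml would need something like the following (the scheme and ssl lines are the parts I'm unsure about, and the CA path is again an assumption):

```yaml
- module: elasticsearch
  xpack.enabled: true            # ship data in the format Stack Monitoring expects
  period: 10s
  hosts:
    - https://localhost:9200     # https rather than http
  username: elastic
  password: [redacted]
  ssl.certificate_authorities:
    - /etc/elasticsearch/certs/http_ca.crt   # assumed default CA path; adjust as needed
```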
These are the errors; nothing populates in the Kibana monitoring UI:
journalctl -u metricbeat -f
-- Logs begin at Mon 2023-05-15 12:41:42 CDT. --
May 16 08:04:10 elastic metricbeat[23846]: {"log.level":"error","@timestamp":"2023-05-16T08:04:10.955-0500","log.origin":{"file.name":"module/wrapper.go","file.line":256},"message":"Error fetching data for metricset elasticsearch.node: error making http request: Get \"http://localhost:9200/_nodes/_local\": EOF","service.name":"metricbeat","ecs.version":"1.6.0"}
May 16 08:04:13 elastic metricbeat[23846]: {"log.level":"error","@timestamp":"2023-05-16T08:04:13.552-0500","log.logger":"publisher_pipeline_output","log.origin":{"file.name":"pipeline/client_worker.go","file.line":150},"message":"Failed to connect to backoff(elasticsearch(http://localhost:9200)): Get \"http://localhost:9200\": EOF","service.name":"metricbeat","ecs.version":"1.6.0"}
May 16 08:04:13 elastic metricbeat[23846]: {"log.level":"info","@timestamp":"2023-05-16T08:04:13.552-0500","log.logger":"publisher_pipeline_output","log.origin":{"file.name":"pipeline/client_worker.go","file.line":141},"message":"Attempting to reconnect to backoff(elasticsearch(http://localhost:9200)) with 3 reconnect attempt(s)","service.name":"metricbeat","ecs.version":"1.6.0"}
May 16 08:04:20 elastic metricbeat[23846]: {"log.level":"error","@timestamp":"2023-05-16T08:04:20.954-0500","log.origin":{"file.name":"module/wrapper.go","file.line":256},"message":"Error fetching data for metricset elasticsearch.node_stats: error making http request: Get \"http://localhost:9200/_nodes/_local/stats\": EOF","service.name":"metricbeat","ecs.version":"1.6.0"}
May 16 08:04:20 elastic metricbeat[23846]: {"log.level":"error","@timestamp":"2023-05-16T08:04:20.954-0500","log.origin":{"file.name":"module/wrapper.go","file.line":256},"message":"Error fetching data for metricset elasticsearch.node: error making http request: Get \"http://localhost:9200/_nodes/_local\": EOF","service.name":"metricbeat","ecs.version":"1.6.0"}
May 16 08:04:27 elastic metricbeat[23846]: {"log.level":"error","@timestamp":"2023-05-16T08:04:27.923-0500","log.logger":"publisher_pipeline_output","log.origin":{"file.name":"pipeline/client_worker.go","file.line":150},"message":"Failed to connect to backoff(elasticsearch(http://localhost:9200)): Get \"http://localhost:9200\": EOF","service.name":"metricbeat","ecs.version":"1.6.0"}
May 16 08:04:27 elastic metricbeat[23846]: {"log.level":"info","@timestamp":"2023-05-16T08:04:27.923-0500","log.logger":"publisher_pipeline_output","log.origin":{"file.name":"pipeline/client_worker.go","file.line":141},"message":"Attempting to reconnect to backoff(elasticsearch(http://localhost:9200)) with 4 reconnect attempt(s)","service.name":"metricbeat","ecs.version":"1.6.0"}
May 16 08:04:30 elastic metricbeat[23846]: {"log.level":"info","@timestamp":"2023-05-16T08:04:30.936-0500","log.logger":"monitoring","log.origin":{"file.name":"log/log.go","file.line":187},"message":"Non-zero metrics in the last 30s","service.name":"metricbeat","monitoring":{"metrics":{"beat":{"cgroup":{"cpu":{"cfs":{"period":{"us":100000}},"id":"metricbeat.service"},"cpuacct":{"id":"metricbeat.service","total":{"ns":290893188}},"memory":{"id":"metricbeat.service","mem":{"limit":{"bytes":9223372036854771712},"usage":{"bytes":46444544}}}},"cpu":{"system":{"ticks":130,"time":{"ms":130}},"total":{"ticks":280,"time":{"ms":280},"value":280},"user":{"ticks":150,"time":{"ms":150}}},"handles":{"limit":{"hard":262144,"soft":1024},"open":12},"info":{"ephemeral_id":"dccb5f83-6558-4cf3-aca7-c38cad78c921","name":"metricbeat","uptime":{"ms":30068},"version":"8.6.2"},"memstats":{"gc_next":25042072,"memory_alloc":13672288,"memory_sys":37831688,"memory_total":65798752,"rss":141361152},"runtime":{"goroutines":62}},"libbeat":{"config":{"module":{"running":4,"starts":4},"reloads":1,"scans":1},"output":{"events":{"active":0},"type":"elasticsearch","write":{"bytes":1555}},"pipeline":{"clients":12,"events":{"active":55,"published":55,"retry":92,"total":55},"queue":{"max_events":4096}}},"metricbeat":{"elasticsearch":{"node":{"events":3,"failures":3},"node_stats":{"events":3,"failures":3}},"system":{"cpu":{"events":4,"success":4},"filesystem":{"events":2,"success":2},"fsstat":{"events":1,"success":1},"load":{"events":3,"success":3},"memory":{"events":3,"success":3},"network":{"events":8,"success":8},"process":{"events":22,"success":22},"process_summary":{"events":3,"success":3},"socket_summary":{"events":3,"success":3},"uptime":{"events":1,"success":1}}},"system":{"cpu":{"cores":2},"load":{"1":0.44,"15":0.24,"5":0.34,"norm":{"1":0.22,"15":0.12,"5":0.17}}}},"ecs.version":"1.6.0"}}
May 16 08:04:30 elastic metricbeat[23846]: {"log.level":"error","@timestamp":"2023-05-16T08:04:30.955-0500","log.origin":{"file.name":"module/wrapper.go","file.line":256},"message":"Error fetching data for metricset elasticsearch.node_stats: error making http request: Get \"http://localhost:9200/_nodes/_local/stats\": EOF","service.name":"metricbeat","ecs.version":"1.6.0"}
May 16 08:04:30 elastic metricbeat[23846]: {"log.level":"error","@timestamp":"2023-05-16T08:04:30.956-0500","log.origin":{"file.name":"module/wrapper.go","file.line":256},"message":"Error fetching data for metricset elasticsearch.node: error making http request: Get \"http://localhost:9200/_nodes/_local\": EOF","service.name":"metricbeat","ecs.version":"1.6.0"}
May 16 08:04:40 elastic metricbeat[23846]: {"log.level":"error","@timestamp":"2023-05-16T08:04:40.955-0500","log.origin":{"file.name":"module/wrapper.go","file.line":256},"message":"Error fetching data for metricset elasticsearch.node_stats: error making http request: Get \"http://localhost:9200/_nodes/_local/stats\": EOF","service.name":"metricbeat","ecs.version":"1.6.0"}
May 16 08:04:40 elastic metricbeat[23846]: {"log.level":"error","@timestamp":"2023-05-16T08:04:40.956-0500","log.origin":{"file.name":"module/wrapper.go","file.line":256},"message":"Error fetching data for metricset elasticsearch.node: error making http request: Get \"http://localhost:9200/_nodes/_local\": EOF","service.name":"metricbeat","ecs.version":"1.6.0"}
Need another pair of eyes!