Hi Team,

I have installed Filebeat on a server with IP 192.168.x.x, and Filebeat is trying to send events to two Logstash servers with IPs in the 10.20.x.x range. Currently the filebeat service is failing continuously: it is unable to send logs to Logstash and also fails to connect to Kibana.
filebeat.yml:
filebeat.inputs:
- type: log
  fields_under_root: true
  fields:
    log_type: federate_server1
    app_id: fs
  multiline.pattern: ^[[:space:]]+(at|\.{3})\b|^Caused by:|^java|^...|^-
  multiline.negate: true
  multiline.match: after
  paths:
    - /opt/federate/log/*

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~

setup.dashboards.enabled: true
setup.kibana:
  host: "http://10.20.x.1:5601"
  username: elastic
  password: ${es_pwd}

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: true

output.logstash:
  hosts: ['10.20.x.1:5044', '10.20.x.2:5044']
  loadbalance: true
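Once the ports are opened, I plan to sanity-check the config file and the Logstash output from the Filebeat host using Filebeat's built-in test subcommands (paths below assume the default RPM install locations):

```shell
# Validate filebeat.yml syntax (default config path on RPM installs)
filebeat test config -c /etc/filebeat/filebeat.yml

# Attempt a connection to each configured Logstash host and report reachability
filebeat test output -c /etc/filebeat/filebeat.yml
```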
Error logs:
Sep 24 19:40:48 <hostname> filebeat[16125]: 2021-09-24T19:40:48.992+0300 ERROR instance/beat.go:989 Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": context deadline exceeded (Client.Timeout exceeded while awaiting headers). Response: .
Sep 24 19:40:48 <hostname> systemd[1]: Unit filebeat.service entered failed state.
Sep 24 19:40:48 <hostname> systemd[1]: filebeat.service failed.
Sep 24 19:42:46 <hostname> heartbeat: 2021-09-24T19:42:46.115+0300#011ERROR#011[publisher_pipeline_output]#011pipeline/output.go:154#011Failed to connect to backoff(async(tcp://10.20.x.2:5044)): dial tcp 10.20.x.2:5044: i/o timeout
Sep 24 19:43:27 <hostname> metricbeat: Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": dial tcp 10.20.x.1:5601: i/o timeout (Client.Timeout exceeded while awaiting headers). Response: .
Complete logs:
[root@<hostname> ~]# systemctl restart filebeat; journalctl -fu filebeat
-- Logs begin at Wed 2021-08-18 14:14:36 +03. --
Sep 24 19:40:48 <hostname> filebeat[16125]: 2021-09-24T19:40:48.992+0300 INFO [monitoring] log/log.go:154 Uptime: 1m30.091989299s
Sep 24 19:40:48 <hostname> filebeat[16125]: 2021-09-24T19:40:48.992+0300 INFO [monitoring] log/log.go:131 Stopping metrics logging.
Sep 24 19:40:48 <hostname> filebeat[16125]: 2021-09-24T19:40:48.992+0300 INFO instance/beat.go:470 filebeat stopped.
Sep 24 19:40:48 <hostname> filebeat[16125]: 2021-09-24T19:40:48.992+0300 ERROR instance/beat.go:989 Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": context deadline exceeded (Client.Timeout exceeded while awaiting headers). Response: .
Sep 24 19:40:48 <hostname> filebeat[16125]: Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": context deadline exceeded (Client.Timeout exceeded while awaiting headers). Response: .
Sep 24 19:40:48 <hostname> systemd[1]: filebeat.service: main process exited, code=exited, status=1/FAILURE
Sep 24 19:40:48 <hostname> systemd[1]: Stopped Filebeat sends log files to Logstash or directly to Elasticsearch..
Sep 24 19:40:48 <hostname> systemd[1]: Unit filebeat.service entered failed state.
Sep 24 19:40:48 <hostname> systemd[1]: filebeat.service failed.
Sep 24 19:40:48 <hostname> systemd[1]: Started Filebeat sends log files to Logstash or directly to Elasticsearch..
Sep 24 19:40:49 <hostname> filebeat[16757]: 2021-09-24T19:40:49.141+0300 INFO instance/beat.go:665 Home path: [/usr/share/filebeat]
Sep 24 19:42:19 <hostname> filebeat[16757]: 2021-09-24T19:42:19.160+0300 ERROR instance/beat.go:989 Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": context deadline exceeded (Client.Timeout exceeded while awaiting headers). Response: .
Sep 24 19:42:19 <hostname> filebeat[16757]: Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": context deadline exceeded (Client.Timeout exceeded while awaiting headers). Response: .
Sep 24 19:42:19 <hostname> systemd[1]: filebeat.service: main process exited, code=exited, status=1/FAILURE
Sep 24 19:42:19 <hostname> systemd[1]: Unit filebeat.service entered failed state.
Sep 24 19:42:19 <hostname> systemd[1]: filebeat.service failed.
Sep 24 19:42:19 <hostname> systemd[1]: filebeat.service holdoff time over, scheduling restart.
Sep 24 19:42:19 <hostname> systemd[1]: Stopped Filebeat sends log files to Logstash or directly to Elasticsearch..
Sep 24 19:42:19 <hostname> systemd[1]: Started Filebeat sends log files to Logstash or directly to Elasticsearch..
Sep 24 19:42:19 <hostname> filebeat[17279]: 2021-09-24T19:42:19.456+0300 INFO instance/beat.go:665 Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
Sep 24 19:42:19 <hostname> filebeat[17279]: 2021-09-24T19:42:19.460+0300 INFO [beat] instance/beat.go:1030 Host info {"system_info": {"host": {"architecture":"x86_64","boot_time":"2021-08-18T14:14:34+03:00","containerized":false,"name":"<hostname>","ip":["127.0.0.1/8","::1/128","192.168.x.x/24","fe80::50f9:284e:240d:8471/64"],"kernel_version":"3.10.0-1160.36.2.el7.x86_64","mac":["ec:eb:b8:98:a2:2c","ec:eb:b8:98:a2:2d","ec:eb:b8:98:a2:2e","ec:eb:b8:98:a2:2f"],"os":{"type":"linux","family":"redhat","platform":"rhel","name":"Red Hat Enterprise Linux Server","version":"7.9 (Maipo)","major":7,"minor":9,"patch":0,"codename":"Maipo"},"timezone":"+03","timezone_offset_sec":10800,"id":"c010ccef34dc4f06bb8861c24e7ea9ad"}}}
Sep 24 19:42:19 <hostname> filebeat[17279]: 2021-09-24T19:42:19.461+0300 INFO instance/beat.go:309 Setup Beat: filebeat; Version: 7.14.0
Sep 24 19:42:19 <hostname> filebeat[17279]: 2021-09-24T19:42:19.461+0300 INFO [publisher] pipeline/module.go:113 Beat name: server1
Sep 24 19:42:19 <hostname> filebeat[17279]: 2021-09-24T19:42:19.462+0300 WARN beater/filebeat.go:178 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
Sep 24 19:42:19 <hostname> filebeat[17279]: 2021-09-24T19:42:19.462+0300 INFO [monitoring] log/log.go:118 Starting metrics logging every 30s
Sep 24 19:42:19 <hostname> filebeat[17279]: 2021-09-24T19:42:19.462+0300 INFO kibana/client.go:122 Kibana url: http://10.20.x.1:5601
Sep 24 19:42:22 <hostname> filebeat[17279]: 2021-09-24T19:42:22.460+0300 INFO [add_cloud_metadata] add_cloud_metadata/add_cloud_metadata.go:101 add_cloud_metadata: hosting provider type not detected.
Sep 24 19:42:49 <hostname> filebeat[17279]: 2021-09-24T19:42:49.470+0300 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":60,"time":{"ms":61}},"total":{"ticks":270,"time":{"ms":272},"value":270},"user":{"ticks":210,"time":{"ms":211}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":10},"info":{"ephemeral_id":"3abca521-e970-411d-956c-413490ed2643","uptime":{"ms":30073},"version":"7.14.0"},"memstats":{"gc_next":18712048,"memory_alloc":11335152,"memory_sys":78988296,"memory_total":55313336,"rss":105451520},"runtime":{"goroutines":16}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"active":0},"type":"logstash"},"pipeline":{"clients":0,"events":{"active":0},"queue":{"max_events":4096}}},"registrar":{"states":{"current":0}},"system":{"cpu":{"cores":32},"load":{"1":0.03,"15":0.05,"5":0.02,"norm":{"1":0.0009,"15":0.0016,"5":0.0006}}}}}}
Sep 24 19:42:19 <hostname> filebeat: 2021-09-24T19:42:19.160+0300#011ERROR#011instance/beat.go:989#011Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": context deadline exceeded (Client.Timeout exceeded while awaiting headers). Response: .
Sep 24 19:42:19 <hostname> filebeat: Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": context deadline exceeded (Client.Timeout exceeded while awaiting headers). Response: .
Sep 24 19:42:46 <hostname> heartbeat: 2021-09-24T19:42:46.115+0300#011ERROR#011[publisher_pipeline_output]#011pipeline/output.go:154#011Failed to connect to backoff(async(tcp://10.20.x.2:5044)): dial tcp 10.20.x.2:5044: i/o timeout
Sep 24 19:43:11 <hostname> heartbeat: 2021-09-24T19:43:11.412+0300#011ERROR#011[publisher_pipeline_output]#011pipeline/output.go:154#011Failed to connect to backoff(async(tcp://10.20.x.1:5044)): dial tcp 10.20.x.1:5044: i/o timeout
Sep 24 19:43:27 <hostname> metricbeat: 2021-09-24T19:43:27.573+0300#011ERROR#011instance/beat.go:989#011Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": dial tcp 10.20.x.1:5601: i/o timeout (Client.Timeout exceeded while awaiting headers). Response: .
Sep 24 19:43:27 <hostname> metricbeat: Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": dial tcp 10.20.x.1:5601: i/o timeout (Client.Timeout exceeded while awaiting headers). Response: .
Sep 24 19:43:49 <hostname> filebeat: 2021-09-24T19:43:49.475+0300#011ERROR#011instance/beat.go:989#011Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": context deadline exceeded (Client.Timeout exceeded while awaiting headers). Response: .
Sep 24 19:43:49 <hostname> filebeat: Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://10.20.x.1:5601/api/status fails: fail to execute the HTTP GET request: Get "http://10.20.x.1:5601/api/status": context deadline exceeded (Client.Timeout exceeded while awaiting headers). Response: .
Sep 24 19:44:01 <hostname> heartbeat: 2021-09-24T19:44:01.699+0300#011ERROR#011[publisher_pipeline_output]#011pipeline/output.go:154#011Failed to connect to backoff(async(tcp://10.20.x.2:5044)): dial tcp 10.20.x.2:5044: i/o timeout
Checking connectivity from the 192.168.x.x server to the Logstash/Kibana servers:
[root@ ~]# nc -v 10.20.x.1 5044
Ncat: Version 7.50 ( https://nmap.org/ncat )
Ncat: Connection timed out.
[root@ ~]# nc -v 10.20.x.2 5044
Ncat: Version 7.50 ( https://nmap.org/ncat )
Ncat: Connection timed out.
[root@ ~]# nc -v 10.20.x.1 5601
Ncat: Version 7.50 ( https://nmap.org/ncat )
Ncat: Connection timed out.
[root@ ~]# nc -v 10.20.x.2 5601
Ncat: Version 7.50 ( https://nmap.org/ncat )
Ncat: Connection timed out.
[root@ ~]# nc -v 10.20.x.1 9200
Ncat: Version 7.50 ( https://nmap.org/ncat )
Ncat: Connection timed out.
[root@ ~]# nc -v 10.20.x.2 9200
Ncat: Version 7.50 ( https://nmap.org/ncat )
Ncat: Connection timed out.
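Since all six nc attempts time out, here is a small loop I put together to recheck every host/port pair in one go once the firewall request goes through. It uses bash's built-in /dev/tcp instead of ncat, so it needs no extra packages (the 10.20.x.* placeholders stand in for the real Logstash/Kibana IPs):

```shell
#!/usr/bin/env bash
# Quick TCP reachability check using bash's /dev/tcp pseudo-device.
# "closed/filtered" covers both connection refused and firewall drops.
check_port() {
  local host=$1 port=$2
  if timeout 5 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "${host}:${port} open"
  else
    echo "${host}:${port} closed/filtered"
  fi
}

# Beats->Logstash (5044), Kibana (5601), Elasticsearch (9200)
for host in 10.20.x.1 10.20.x.2; do
  for port in 5044 5601 9200; do
    check_port "$host" "$port"
  done
done
```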
- It's clear that I need to open port 5044 from the Filebeat server to the two Logstash servers, and port 5601 to Kibana, but do I also need to open the Elasticsearch port 9200?
Is there anything else required, apart from opening these ports, for the Filebeat agent to successfully send events to the Logstash servers and connect to Kibana?
I need to send the port-opening request and do not want to miss anything in it.
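For reference, if the Logstash/Kibana servers run firewalld, I believe the rules I would be requesting look roughly like the following (a sketch only; the exact zone and whether 9200 is needed are exactly what I am asking about):

```shell
# On each Logstash server: allow Beats traffic from the Filebeat host
firewall-cmd --permanent --add-port=5044/tcp

# On the Kibana server: allow access to the Kibana HTTP endpoint
firewall-cmd --permanent --add-port=5601/tcp

# Apply the permanent rules
firewall-cmd --reload
```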
- I have Kibana installed on two servers. Do I need to mention both Kibana IPs in the above filebeat.yml, the same way the two Logstash IPs are mentioned?
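In other words, would the Kibana section need to change to something like the following (hypothetical; 10.20.x.2 is assumed to be the second Kibana instance), or does setup.kibana only accept a single host?

```yaml
# Hypothetical - unclear whether setup.kibana supports multiple hosts
# the way output.logstash.hosts does:
setup.kibana:
  host: "http://10.20.x.1:5601"    # current single host
  # host: "http://10.20.x.2:5601"  # second Kibana instance?
```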
Thanks,