Elasticsearch Firewall Integrations Issue

I am having an issue getting the Elastic Agent to start listening for firewall (Check Point & Cisco) syslog on specific UDP ports.

My setup is as follows:
A Linux server running Elasticsearch, Kibana, and Elastic Agent.
The firewalls are set to send logs to this server's IP on different UDP ports. (The logs are reaching the server, as I can see them when I start Logstash.)
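For what it's worth, a quick packet capture along these lines should also confirm the packets are arriving, independent of any listener (the ports below are placeholders for the ones I actually use):

# Confirm the firewalls' syslog traffic reaches the server
sudo tcpdump -ni any udp port 9001 or udp port 9002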

I have applied a policy to the agent that contains the Cisco and Check Point integrations, with the ports configured to match the firewall configuration.
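As far as I understand it, those integration settings should end up in the agent policy as UDP inputs roughly like this (just a sketch of how I expect Fleet to render it, not my actual policy; the port is a placeholder):

inputs:
  - type: udp
    streams:
      - data_stream:
          dataset: checkpoint.firewall
        host: "0.0.0.0:9001"

so I would expect the agent to bind those ports once the policy is applied.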

When I run the ss -lpun command, I can't see the server listening on these ports.
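For reference, this is the kind of check I am running (the ports are again placeholders for the ones configured in the integrations); if the inputs had started, I would expect to see filebeat bound to them:

sudo ss -lpun | grep -E ':9001|:9002'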

Hello and welcome,

Please share your Elastic Agent configuration; a screenshot from Kibana is fine if possible.

Also, did you check the Elastic Agent logs? If it is not listening on any ports, then the input likely had an issue starting up.
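If you are not sure where to look, on a default Linux install something along these lines should help (paths can differ slightly between versions):

# Agent and unit health, including inputs that failed to start
sudo elastic-agent status

# The rendered policy the agent is actually running
sudo elastic-agent inspect

# Log files written by the agent and its components
sudo ls /opt/Elastic/Agent/data/elastic-agent-*/logs/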

You mention that you have Logstash as well; is it on the same server?

@leandrojmp here is the additional info:

Apparently, I am not getting any logs in Kibana. Here is what I get via Diagnostics.

{"log.level":"info","@timestamp":"2024-04-30T12:02:55.394Z","message":"filebeat start running.","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"udp-default","type":"udp"},"log":{"source":"udp-default"},"ecs.version":"1.6.0","log.origin":{"file.line":520,"file.name":"instance/beat.go","function":"github.com/elastic/beats/v7/libbeat/cmd/instance.(*Beat).launch"},"service.name":"filebeat","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:55.395Z","message":"Finished loading transaction log file for '/opt/Elastic/Agent/data/elastic-agent-8.13.1-379dfc/run/udp-default/registry/filebeat'. Active transaction id=0","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"udp-default","type":"udp"},"log":{"source":"udp-default"},"log.origin":{"file.line":134,"file.name":"memlog/store.go","function":"github.com/elastic/beats/v7/libbeat/statestore/backend/memlog.openStore"},"service.name":"filebeat","ecs.version":"1.6.0","ecs.version":"1.6.0"}
{"log.level":"warn","@timestamp":"2024-04-30T12:02:55.395Z","message":"Filebeat is unable to load the ingest pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the ingest pipelines or are using Logstash pipelines, you can ignore this warning.","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"udp-default","type":"udp"},"log":{"source":"udp-default"},"log.origin":{"file.line":331,"file.name":"beater/filebeat.go","function":"github.com/elastic/beats/v7/filebeat/beater.(*Filebeat).Run"},"service.name":"filebeat","ecs.version":"1.6.0","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:55.395Z","message":"creating new InputManager","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"udp-default","type":"udp"},"log":{"source":"udp-default"},"log.origin":{"file.line":55,"file.name":"shipper/input.go","function":"github.com/elastic/beats/v7/x-pack/filebeat/input/shipper.NewInputManager"},"service.name":"filebeat","ecs.version":"1.6.0","log.logger":"input","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:55.395Z","message":"States Loaded from registrar: 0","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"udp-default","type":"udp"},"log":{"source":"udp-default"},"log.logger":"registrar","log.origin":{"file.line":107,"file.name":"registrar/registrar.go","function":"github.com/elastic/beats/v7/filebeat/registrar.(*Registrar).loadStates"},"service.name":"filebeat","ecs.version":"1.6.0","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:55.395Z","message":"Loading Inputs: 0","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"udp-default","type":"udp"},"log":{"source":"udp-default"},"log.logger":"crawler","log.origin":{"file.line":71,"file.name":"beater/crawler.go","function":"github.com/elastic/beats/v7/filebeat/beater.(*crawler).Start"},"service.name":"filebeat","ecs.version":"1.6.0","ecs.version":"1.6.0"}


{"log.level":"error","@timestamp":"2024-04-30T12:02:59.194Z","message":"add_cloud_metadata: received error failed requesting hetzner metadata: Get \"http://169.254.169.254/hetzner/v1/metadata/instance-id\": dial tcp 169.254.169.254:80: connect: connection refused","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"service.name":"metricbeat","ecs.version":"1.6.0","log.logger":"add_cloud_metadata","log.origin":{"file.line":173,"file.name":"add_cloud_metadata/providers.go","function":"github.com/elastic/beats/v7/libbeat/processors/add_cloud_metadata.(*addCloudMetadata).fetchMetadata"},"ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:59.194Z","message":"running under elastic-agent, per-beat lockfiles disabled","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"log.origin":{"file.line":436,"file.name":"instance/beat.go","function":"github.com/elastic/beats/v7/libbeat/cmd/instance.(*Beat).launch"},"service.name":"metricbeat","ecs.version":"1.6.0","ecs.version":"1.6.0"}
{"log.level":"error","@timestamp":"2024-04-30T12:02:59.194Z","message":"add_cloud_metadata: received error failed requesting gcp metadata: Get \"http://169.254.169.254/computeMetadata/v1/?recursive=true&alt=json\": dial tcp 169.254.169.254:80: connect: connection refused","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"service.name":"metricbeat","ecs.version":"1.6.0","log.logger":"add_cloud_metadata","log.origin":{"file.line":173,"file.name":"add_cloud_metadata/providers.go","function":"github.com/elastic/beats/v7/libbeat/processors/add_cloud_metadata.(*addCloudMetadata).fetchMetadata"},"ecs.version":"1.6.0"}
{"log.level":"error","@timestamp":"2024-04-30T12:02:59.195Z","message":"add_cloud_metadata: received error failed requesting digitalocean metadata: Get \"http://169.254.169.254/metadata/v1.json\": dial tcp 169.254.169.254:80: connect: connection refused","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"log.logger":"add_cloud_metadata","log.origin":{"file.line":173,"file.name":"add_cloud_metadata/providers.go","function":"github.com/elastic/beats/v7/libbeat/processors/add_cloud_metadata.(*addCloudMetadata).fetchMetadata"},"service.name":"metricbeat","ecs.version":"1.6.0","ecs.version":"1.6.0"}
{"log.level":"error","@timestamp":"2024-04-30T12:02:59.195Z","message":"add_cloud_metadata: received error failed requesting azure metadata: Get \"http://169.254.169.254/metadata/instance/compute?api-version=2021-02-01\": dial tcp 169.254.169.254:80: connect: connection refused","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"log.origin":{"file.line":173,"file.name":"add_cloud_metadata/providers.go","function":"github.com/elastic/beats/v7/libbeat/processors/add_cloud_metadata.(*addCloudMetadata).fetchMetadata"},"service.name":"metricbeat","ecs.version":"1.6.0","log.logger":"add_cloud_metadata","ecs.version":"1.6.0"}
{"log.level":"error","@timestamp":"2024-04-30T12:02:59.195Z","message":"add_cloud_metadata: received error failed requesting openstack metadata: Get \"https://169.254.169.254/2009-04-04/meta-data/placement/availability-zone\": dial tcp 169.254.169.254:443: connect: connection refused","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"service.name":"metricbeat","ecs.version":"1.6.0","log.logger":"add_cloud_metadata","log.origin":{"file.line":173,"file.name":"add_cloud_metadata/providers.go","function":"github.com/elastic/beats/v7/libbeat/processors/add_cloud_metadata.(*addCloudMetadata).fetchMetadata"},"ecs.version":"1.6.0"}
{"log.level":"error","@timestamp":"2024-04-30T12:02:59.195Z","message":"add_cloud_metadata: received error failed requesting openstack metadata: Get \"http://169.254.169.254/2009-04-04/meta-data/placement/availability-zone\": dial tcp 169.254.169.254:80: connect: connection refused","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"log.logger":"add_cloud_metadata","log.origin":{"file.line":173,"file.name":"add_cloud_metadata/providers.go","function":"github.com/elastic/beats/v7/libbeat/processors/add_cloud_metadata.(*addCloudMetadata).fetchMetadata"},"service.name":"metricbeat","ecs.version":"1.6.0","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:59.195Z","message":"Starting stats endpoint","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"service.name":"metricbeat","ecs.version":"1.6.0","log.logger":"api","log.origin":{"file.line":69,"file.name":"api/server.go","function":"github.com/elastic/beats/v7/libbeat/api.(*Server).Start"},"ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:59.195Z","message":"Metrics endpoint listening on: /opt/Elastic/Agent/data/tmp/akSPbdqgaHaTY0_J01-dsfYK6JpMz2zn.sock (configured: unix:///opt/Elastic/Agent/data/tmp/akSPbdqgaHaTY0_J01-dsfYK6JpMz2zn.sock)","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"log.logger":"api","log.origin":{"file.line":71,"file.name":"api/server.go","function":"github.com/elastic/beats/v7/libbeat/api.(*Server).Start.func1"},"service.name":"metricbeat","ecs.version":"1.6.0","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:59.196Z","message":"Syscall filter successfully installed","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"service.name":"metricbeat","ecs.version":"1.6.0","log.logger":"seccomp","log.origin":{"file.line":125,"file.name":"seccomp/seccomp.go","function":"github.com/elastic/beats/v7/libbeat/common/seccomp.loadFilter"},"ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:59.196Z","message":"Beat info","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"log.logger":"beat","log.origin":{"file.line":1365,"file.name":"instance/beat.go","function":"github.com/elastic/beats/v7/libbeat/cmd/instance.logSystemInfo"},"service.name":"metricbeat","system_info":{"beat":{"path":{"config":"/opt/Elastic/Agent/data/elastic-agent-8.13.1-379dfc/components","data":"/opt/Elastic/Agent/data/elastic-agent-8.13.1-379dfc/run/http/metrics-monitoring","home":"/opt/Elastic/Agent/data/elastic-agent-8.13.1-379dfc/components","logs":"/opt/Elastic/Agent/data/elastic-agent-8.13.1-379dfc/components/logs"},"type":"metricbeat","uuid":"2f819882-8ec5-4ddd-8c7c-e8ddff6e480f"},"ecs.version":"1.6.0"},"ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:59.196Z","message":"Build info","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"system_info":{"build":{"commit":"e9e462d71bdcd33a84d7f51753a116b5d418938f","libbeat":"8.13.1","time":"2024-03-27T15:40:21.000Z","version":"8.13.1"},"ecs.version":"1.6.0"},"log.logger":"beat","log.origin":{"file.line":1374,"file.name":"instance/beat.go","function":"github.com/elastic/beats/v7/libbeat/cmd/instance.logSystemInfo"},"service.name":"metricbeat","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:59.196Z","message":"Go runtime info","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"log.origin":{"file.line":1377,"file.name":"instance/beat.go","function":"github.com/elastic/beats/v7/libbeat/cmd/instance.logSystemInfo"},"service.name":"metricbeat","system_info":{"ecs.version":"1.6.0","go":{"arch":"amd64","max_procs":4,"os":"linux","version":"go1.21.8"}},"log.logger":"beat","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2024-04-30T12:02:59.198Z","message":"Host info","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"log.logger":"beat","log.origin":{"file.line":1383,"file.name":"instance/beat.go","function":"github.com/elastic/beats/v7/libbeat/cmd/instance.logSystemInfo"},"service.name":"metricbeat","system_info":{"ecs.version":"1.6.0","host":{"architecture":"x86_64","boot_time":"2024-04-02T13:01:48+03:00","containerized":false,"id":"60645e37f21b41caa7da518fbefd9a3c","ip":["127.0.0.1","::1","172.16.45.190","fe80::20c:29ff:fe14:ff89"],"kernel_version":"4.18.0-513.11.1.el8_9.x86_64","mac":["00:0c:29:14:ff:89"],"name":"ke-cha-sec-es-srv","os":{"codename":"Midnight Oncilla","family":"redhat","major":8,"minor":9,"name":"AlmaLinux","patch":0,"platform":"almalinux","type":"linux","version":"8.9 (Midnight Oncilla)"},"timezone":"EAT","timezone_offset_sec":10800}},"ecs.version":"1.6.0"}

{"log.level":"error","@timestamp":"2024-04-30T12:03:00.301Z","message":"Failed reading CA certificate: open : no such file or directory","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"service.name":"metricbeat","ecs.version":"1.6.0","log.logger":"tls","log.origin":{"file.line":216,"file.name":"tlscommon/tls.go","function":"github.com/elastic/elastic-agent-libs/transport/tlscommon.LoadCertificateAuthorities"},"ecs.version":"1.6.0"}
{"log.level":"error","@timestamp":"2024-04-30T12:03:00.301Z","message":"could not start output","component":{"binary":"metricbeat","dataset":"elastic_agent.metricbeat","id":"http/metrics-monitoring","type":"http/metrics"},"log":{"source":"http/metrics-monitoring"},"log.origin":{"file.line":631,"file.name":"management/managerV2.go","function":"github.com/elastic/beats/v7/x-pack/libbeat/management.(*BeatV2Manager).reload"},"service.name":"metricbeat","error":{"message":"failed to reload output: open : no such file or directory reading <nil> accessing 'elasticsearch'"},"ecs.version":"1.6.0","log.logger":"centralmgmt.V2-manager","ecs.version":"1.6.0"}
{"log.level":"error","@timestamp":"2024-04-30T12:03:00.302Z","log.origin":{"file.name":"coordinator/coordinator.go","file.line":624},"message":"Unit state changed http/metrics-monitoring (STARTING->FAILED): could not start output: failed to reload output: open : no such file or directory reading <nil> accessing 'elasticsearch'","log":{"source":"elastic-agent"},"component":{"id":"http/metrics-monitoring","state":"HEALTHY"},"unit":{"id":"http/metrics-monitoring","type":"output","state":"FAILED","old_state":"STARTING"},"ecs.version":"1.6.0"}

Yes, I have Logstash on the server, but it is not running.

I updated the CA certificate under Fleet > Settings > Outputs > Elasticsearch, and all the issues were resolved.
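For anyone who runs into the same "Failed reading CA certificate: open : no such file or directory" error, the setting involved can be supplied through the output's advanced YAML configuration, roughly like this (the path is a placeholder for wherever the CA certificate lives on the agent hosts):

ssl.certificate_authorities: ["/etc/pki/tls/certs/elasticsearch-ca.pem"]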