Hello Everyone,
I am in the process of setting up the ELK stack. I have already successfully integrated some Windows Server machines (through agents), some Linux hosts, and our switches. I am running the latest version of the ELK stack and of the agents.
For the past week I have been failing to integrate our Barracuda CloudGen firewalls. (Hence the Friday afternoon post of despair.)
My understanding is that I can send data directly to the Elastic Agent, as documented in [1]:

> These steps were written with a Logstash server as the intended destination, and where it references the "Hostname" use the address and port of the Elastic Agent that is running this integration. Logstash is not used as part of this integration.
Steps taken so far:

- Configured the firewall to send data to the Elastic Agent as documented in [2].
- Installed the Barracuda CloudGen integration on the correct policy, which got rolled out to the agent (double-checked in the agent logs, where `barracuda_cloudgen_firewall` lines are appearing).
- Ran tcpdump on port 5044 (the integration's default), which shows that data is incoming. Various LLMs suggest there is a protocol mismatch going on here; anonymized tcpdump in [3]. (See the TLS probe sketch after this list.)
- Set the listen address in the integration to 0.0.0.0.
- Ran `sudo ss -tulnp | grep 5044`, which confirms the agent is listening on the port (see the Lumberjack connectivity test after this list):

  ```
  tcp LISTEN 0 4096 *:5044 *:* users:(("agentbeat",pid=877,fd=12))
  ```

- Reinstalled / reconfigured the agent from scratch, to no avail.
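Regarding the suspected protocol mismatch: the most likely cause I could think of is the firewall opening a TLS session while the integration's listener runs in plaintext, or the other way around. To narrow that down I probed the listener with a quick Python sketch (the host and port are placeholders for my agent's address and the integration's port):

```python
import socket
import ssl

HOST = "127.0.0.1"  # placeholder: address of the Elastic Agent
PORT = 5044         # placeholder: port configured in the integration

# Try a TLS handshake against the integration's listener.
# Success -> the listener expects TLS, so a plaintext sender would mismatch.
# Failure -> the listener is (probably) plaintext, so a TLS sender would mismatch.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

try:
    with socket.create_connection((HOST, PORT), timeout=5) as raw:
        with ctx.wrap_socket(raw) as tls:
            print("listener completed a TLS handshake:", tls.version())
except (ssl.SSLError, ConnectionResetError, TimeoutError) as exc:
    print("TLS handshake failed, listener is likely plaintext:", exc)
```

If one side speaks TLS and the other plaintext, that would match the garbage visible in the capture [3].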
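And since `ss` shows the port open but nothing arrives downstream, I also tried talking the Beats/Lumberjack v2 protocol to the listener by hand, to check whether it ACKs a frame at all. Again only a sketch, assuming a plaintext listener; getting an ACK frame back (`2A` plus the sequence number) would suggest the listener itself is healthy and the problem is on the sending side:

```python
import json
import socket
import struct

HOST = "127.0.0.1"  # placeholder: address of the Elastic Agent
PORT = 5044         # placeholder: port configured in the integration

# One test event, framed as a Lumberjack v2 JSON data frame.
payload = json.dumps({"message": "lumberjack connectivity test"}).encode()

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    # window-size frame: version '2', type 'W', uint32 window size (big-endian)
    sock.sendall(b"2W" + struct.pack(">I", 1))
    # JSON data frame: version '2', type 'J', uint32 sequence, uint32 length, payload
    sock.sendall(b"2J" + struct.pack(">II", 1, len(payload)) + payload)
    # a healthy plaintext listener answers with an ACK frame: '2A' + uint32 sequence
    try:
        resp = sock.recv(6)
    except ConnectionResetError:
        resp = b""
    print("ACK:", resp.hex() if resp else "none (connection dropped)")
```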
No matter the action, I cannot see data coming into Elasticsearch / can't see it in Kibana.
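For completeness, this is how I checked on the Elasticsearch side, bypassing Kibana and its time picker. A sketch with assumptions: the data stream pattern is a guess (the real name can be verified under Fleet's data streams view), and the endpoint/credentials are placeholders:

```python
import base64
import json
import ssl
import urllib.request

ES_URL = "https://localhost:9200"  # placeholder: Elasticsearch endpoint
AUTH = base64.b64encode(b"elastic:changeme").decode()  # placeholder credentials

# Self-signed certificates are common on test clusters, so skip verification here.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

# Count documents in the integration's data stream (name assumed, verify in Fleet).
req = urllib.request.Request(
    f"{ES_URL}/logs-barracuda_cloudgen_firewall.*/_count",
    headers={"Authorization": f"Basic {AUTH}"},
)
with urllib.request.urlopen(req, timeout=10, context=ctx) as resp:
    print(json.load(resp))
```

The count stays at 0 no matter what I send.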
Any help is greatly appreciated.
[1] www dot elastic dot co/docs/reference/integrations/barracuda_cloudgen_firewall
[2] campus dot barracuda dot com/product/cloudgenfirewall/doc/96025953/how-to-enable-filebeat-stream-to-a-logstash-pipeline/
[3] pastes dot io/anonymized-tcpdump
P.S. I had to replace "." with "dot" as I cannot post links as a new user.