It says I have to use Filebeat (I run fully in Docker) and have port 2055 open on the Filebeat Docker container.
So do I have to set up a Filebeat Docker container with the NetFlow module enabled and have my firewalls forward their NetFlow logs to that container?
And in the Filebeat config file, do I just set its output to the Elasticsearch server rather than my Logstash server?
Welcome to the club. I gave up on NetFlow as nothing worked, and I don't think the community here knows much about it, or at least I didn't make any headway. I spent hours trying to figure it out.
Haven't tried it myself yet (still sitting on a looooong todo list), but I think you read that right. You need a Filebeat with the netflow module enabled and configured.
Since you mentioned you are running fully in Docker, that means you need a container in which Filebeat is running, and you forward the port you configure (2055) into the container.
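Something along these lines should work as a starting point; this is only a minimal sketch, and the image tag, file paths and network details are illustrative assumptions you'd adapt to your setup:

```yaml
# docker-compose.yml -- minimal sketch, image tag and paths are illustrative
services:
  filebeat:
    image: docker.elastic.co/beats/filebeat:8.13.0
    ports:
      - "2055:2055/udp"            # NetFlow is exported over UDP, so forward the UDP port into the container
    volumes:
      - ./filebeat.yml:/usr/share/filebeat/filebeat.yml:ro
      - ./modules.d:/usr/share/filebeat/modules.d:ro
```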
The output configuration depends on whether you need additional message enrichment. I would assume that, at least in the beginning, you would set your Elasticsearch cluster as the output, because there is a
filebeat setup -e
command. That will create the index templates in Elasticsearch, load some dashboards into Kibana, and it looks like it also deploys an ingest pipeline in Elasticsearch. That ingest pipeline will then do the pattern matching etc.
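For reference, a configuration along these lines should be enough to get flows into Elasticsearch; the hosts, credentials and listen address below are placeholders, not values from your environment:

```yaml
# filebeat.yml -- minimal sketch, hosts and credentials are placeholders
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml

output.elasticsearch:
  hosts: ["https://elasticsearch:9200"]
  username: "elastic"
  password: "changeme"

setup.kibana:
  host: "http://kibana:5601"
```

```yaml
# modules.d/netflow.yml -- enable the NetFlow module and listen on 2055/udp
- module: netflow
  log:
    enabled: true
    var:
      netflow_host: 0.0.0.0        # listen on all interfaces inside the container
      netflow_port: 2055
```

With that in place you would run the setup step once (for example `docker compose run --rm filebeat setup -e`) so the templates, dashboards and ingest pipeline get loaded before the container starts shipping data.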
Logstash output could be used if you need any additional data enrichment not provided out of the box by the ingest pipeline. But I would think that in your case the Elasticsearch output is probably what you're after.
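If you do end up putting Logstash in between, the output section of filebeat.yml would instead look something like this (host and port are assumptions based on a typical Beats input on the Logstash side):

```yaml
# Alternative: ship to Logstash instead of Elasticsearch (placeholder host/port)
output.logstash:
  hosts: ["logstash:5044"]
```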
Is there something not working? If yes, could you give us some more details about what exactly is failing? Log entries, etc.?