Ok, here is what happened.
Originally we had a pipeline receiving logs from our firewall devices into our ELK stack, and everything was working perfectly.
Then we wanted to add the Netflow module to the same Logstash server.
So I went through the Netflow module configuration as suggested in https://www.elastic.co/guide/en/logstash/current/netflow-module.html#configuring-netflow
I set up the logstash.yml file with the module parameters we need according to our configuration, stopped the logstash service with systemctl stop logstash, and finally ran
bin/logstash --modules netflow --setup --path.settings /etc/logstash
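For reference, the module block I added to logstash.yml looks roughly like this (the port and host values below are placeholders from the docs, not our real settings):

```yaml
# /etc/logstash/logstash.yml -- Netflow module settings (values are examples)
modules:
  - name: netflow
    var.input.udp.port: 2055
    var.elasticsearch.hosts: "localhost:9200"
    var.kibana.host: "localhost:5601"
```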
Everything seemed to work: I went to Kibana, searched for the Netflow-* index pattern, and there it was. The only problem was that it wasn't getting any data.
I went back to my Logstash server, realized the logstash service was not running, ran systemctl start logstash, and only got this:
Sep 25 12:06:07 XXXXX.com systemd: Started logstash.
Sep 25 12:06:07 XXXXX.com systemd: Starting logstash...
Sep 25 12:06:44 XXXXX.com systemd: Stopping logstash...
Sep 25 12:06:44 XXXXX.com systemd: logstash.service: main process exited, code=exited, status=143/n/a
Sep 25 12:06:44 XXXXX.com systemd: Stopped logstash.
Sep 25 12:06:44 XXXXX.com systemd: Unit logstash.service entered failed state.
Sep 25 12:06:44 XXXXX.com systemd: logstash.service failed.
I checked the Logstash log at /var/log/logstash/logstash-plain.log and I am getting this:
[2019-09-25T00:13:15,285][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
[2019-09-25T00:14:16,027][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-09-25T00:14:16,044][FATAL][logstash.runner ] Logstash could not be started because there is already another instance using the configured data directory. If you wish to run multiple instances
So NOW I am getting neither the Netflow data nor the data from my previously working pipeline.
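From the FATAL line, my guess is that my manual --setup run (or some leftover of it) is still holding the data directory lock, so the service can't start; and the WARN about pipelines.yml being ignored makes me think the modules block in logstash.yml is also what's bypassing my original pipeline. What I was planning to try (assuming the default path.data of /var/lib/logstash; adjust if yours differs):

```shell
# Stop the service and check for any leftover Logstash process
sudo systemctl stop logstash
pgrep -af logstash
# If nothing is running, a stale .lock file under path.data may be the culprit
sudo rm /var/lib/logstash/.lock
sudo systemctl start logstash
```

Is that the right direction, and is there a way to run the Netflow module alongside my existing pipelines.yml pipeline on the same server?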