I'm using the Filebeat agent with the system and Apache modules; the output is forwarded directly to Elasticsearch, and I get the preconfigured Kibana dashboards. Now I want to monitor the vsftpd daemon logs and parse them.
I'm stuck, since Filebeat can't be configured with multiple outputs. What is the way to do this?
Log file location: /var/log/vsftpd/vsftpd.log, and I have the grok patterns.
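For context, here is a minimal Python sketch of what a grok pattern for these logs would extract. The regex and field names are assumptions based on the default vsftpd transfer-log format (`xferlog_std_format=NO` style), not the poster's actual patterns:

```python
import re

# Hypothetical regex approximating a grok pattern for vsftpd transfer logs.
# Assumes lines like:
#   Mon Jul 17 12:34:56 2017 [pid 2] [alice] OK UPLOAD: Client "192.168.0.10", ...
PATTERN = re.compile(
    r'^(?P<timestamp>\w{3} \w{3} +\d+ [\d:]+ \d{4}) '   # e.g. Mon Jul 17 12:34:56 2017
    r'\[pid (?P<pid>\d+)\] '                             # vsftpd worker pid
    r'(?:\[(?P<user>[^\]]+)\] )?'                        # FTP user (absent on CONNECT)
    r'(?P<status>OK|FAIL) (?P<action>\w+): '             # e.g. OK UPLOAD
    r'Client "(?P<client_ip>[^"]+)"'                     # client address
    r'(?:, "(?P<file>[^"]+)", (?P<bytes>\d+) bytes, (?P<rate>[\d.]+)Kbyte/sec)?'
)

line = ('Mon Jul 17 12:34:56 2017 [pid 2] [alice] OK UPLOAD: '
        'Client "192.168.0.10", "/home/alice/file.txt", 12345 bytes, 120.00Kbyte/sec')
m = PATTERN.match(line)
print(m.groupdict())
```

The same structure carries over directly to a grok expression, since grok compiles down to named regex captures.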
I'm wondering why you need multiple outputs. Do you plan to send the vsftpd logs to a different cluster? Please explain a little bit more about your use case.
Currently I use the Filebeat system and Apache modules, forwarding directly to Elasticsearch. Now, as a custom requirement, I want to parse the FTP logs and build a dashboard. Is it recommended to use the flow filebeat -> logstash -> elasticsearch, or filebeat -> ingest node -> elasticsearch (creating a new Filebeat vsftpd module) for the FTP logs?
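If you go the ingest-node route, the grok parsing lives in an Elasticsearch ingest pipeline rather than in Logstash. A sketch of such a pipeline, loaded via the Kibana Dev Tools console (the pipeline name, field names, and the grok pattern itself are assumptions based on the default vsftpd transfer-log format):

```console
PUT _ingest/pipeline/vsftpd
{
  "description": "Parse vsftpd transfer log lines (sketch)",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR} \\[pid %{NUMBER:pid}\\] \\[%{USERNAME:user}\\] %{WORD:status} %{WORD:action}: Client \"%{IP:client_ip}\"%{GREEDYDATA:details}"
        ]
      }
    }
  ]
}
```

Filebeat can then keep its single Elasticsearch output and simply reference this pipeline for the vsftpd input, which avoids the multiple-outputs problem entirely.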
I would prefer to have a new Filebeat module created for vsftpd and then forward to Elasticsearch. If that turns out to be more time-consuming than Logstash, I will move over to a Logstash-oriented workflow.
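Whether or not you end up with a full module, a plain log input can already coexist with the system and Apache modules under the one Elasticsearch output. A minimal filebeat.yml sketch (the pipeline name `vsftpd` is an assumption and must match an ingest pipeline you have loaded into Elasticsearch; on older Filebeat versions the section is `filebeat.prospectors` instead of `filebeat.inputs`):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/vsftpd/vsftpd.log
    # Route these events through the vsftpd ingest pipeline on the
    # Elasticsearch side; the system/apache modules are unaffected.
    pipeline: vsftpd

output.elasticsearch:
  hosts: ["localhost:9200"]
```

The per-input `pipeline` setting is what makes a second output unnecessary: all events still go to the same cluster, and only the vsftpd events get the extra parsing.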
It would be really nice if you contributed it to the Beats repository, so more people can benefit from it. Also, if you have questions during the process, don't hesitate to ask them in our discuss forum for developers: https://discuss.elastic.co/c/beats/libbeat