Filebeat parsing ftp logs


(vijay kannan) #1

Hi All,

I am using the Filebeat agent with the system and apache modules. The output is forwarded directly to Elasticsearch, and I get the preconfigured Kibana dashboards. Now I want to monitor and parse the vsftpd daemon logs.

I am stuck since we can't configure multiple outputs in Filebeat. What is the way to do it?

Logfile location:

/var/log/vsftpd/vsftpd.log (I already have the grok patterns for it)
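For reference, a single Filebeat instance can run modules and a plain log input side by side, all through one Elasticsearch output. A minimal sketch, assuming default ports and a hypothetical ingest pipeline name (`vsftpd-log`); on Filebeat versions before 6.3 the input section is called `filebeat.prospectors`:

```yaml
# filebeat.yml — illustrative sketch, adjust module names and paths to your setup
filebeat.modules:
  - module: system
  - module: apache2

filebeat.inputs:
  - type: log
    paths:
      - /var/log/vsftpd/vsftpd.log
    pipeline: vsftpd-log   # hypothetical ingest pipeline that holds the grok patterns

output.elasticsearch:
  hosts: ["localhost:9200"]
```

The per-input `pipeline` setting lets the vsftpd events go through their own ingest pipeline without affecting the module-generated events.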


(Carlos Pérez Aradros) #2

Hi @Vijayakumar_Kannan,

I'm wondering, why do you need multiple outputs? Do you plan to send the vsftpd logs to a different cluster? Please explain a little bit more about your use case.


(vijay kannan) #3

Currently I use the Filebeat system & apache modules and forward to Elasticsearch directly. Now, as a custom requirement, I want to parse the FTP logs and build a dashboard. Which flow is recommended: filebeat -> logstash -> elasticsearch, or filebeat -> ingest node for FTP logs (creating a new Filebeat vsftpd module) -> elasticsearch?

I would prefer to create a new Filebeat module for vsftpd and then forward to Elasticsearch. If that turns out to be more time consuming than Logstash, I will move to a Logstash-oriented workflow.
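The ingest-node flow described above amounts to registering the existing grok patterns as an ingest pipeline in Elasticsearch and pointing Filebeat at it. A minimal sketch, assuming a hypothetical pipeline name and a simplified vsftpd log line format such as `Tue Sep 10 14:12:08 2019 [pid 2] [ftpuser] OK LOGIN: Client "192.168.1.5"` — the actual pattern must match your own log format:

```json
PUT _ingest/pipeline/vsftpd-log
{
  "description": "Parse vsftpd log lines (illustrative example only)",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR} \\[pid %{NUMBER:vsftpd.pid}\\] \\[%{USERNAME:vsftpd.user}\\] %{WORD:vsftpd.status} %{WORD:vsftpd.action}: Client \"%{DATA:vsftpd.client_ip}\""
        ]
      }
    }
  ]
}
```

The same pipeline definition can later be shipped inside a Filebeat module, so work invested here is not lost if you go the module route.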

Any guidance on Filebeat module creation?


(Carlos Pérez Aradros) #4

Awesome!

I would say both approaches have similar complexity. If you want to go for the module, these are the docs on how to do it: https://www.elastic.co/guide/en/beats/devguide/current/filebeat-modules-devguide.html
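Following that guide, the core of a new fileset is a manifest that wires the input config and the ingest pipeline together. A rough sketch of what a vsftpd fileset manifest could look like (the exact keys depend on the Beats version — older versions use `prospector:` instead of `input:`):

```yaml
# filebeat/module/vsftpd/log/manifest.yml — hypothetical layout per the dev guide
module_version: "1.0"

var:
  - name: paths
    default:
      - /var/log/vsftpd/vsftpd.log

ingest_pipeline: ingest/pipeline.json
input: config/log.yml
```

The grok patterns then live in `ingest/pipeline.json`, and sample log lines plus expected parsed events go under the fileset's `test/` directory so the module can be verified automatically.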

It would be really nice if you contributed it to the Beats repository so more people can benefit from it. Also, if you have questions during the process, don't hesitate to ask them in our discuss forum for developers: https://discuss.elastic.co/c/beats/libbeat

Best regards


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.