How to ship custom log files to Elasticsearch?

I am trying to ship custom log files generated by my own application. The log format follows syslog.
I tried to find out how to ship custom log files (as opposed to standard ones like syslog or nginx logs), but didn't succeed. Any tips? Do I need to write a custom module?

It depends how much processing you want to do and how well your format matches syslog:

- If you don't care about structuring the data and just need raw text search, you can use a plain log input, and each entry will go into the index as a line of text.
- If you want to try parsing it as syslog, you can enable Filebeat's system module (which handles syslog) and override its default file paths in modules.d/system.yml.
- Beyond that, you'd need to configure processors, which can be done in a variety of ways: you can extract some data with built-in processors, or if you need more advanced processing, you can define a script processor with custom JavaScript.

There are rough sketches of each option below.
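For the raw-text route, a minimal filebeat.yml sketch; the path and host here are placeholders for your own setup:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log   # hypothetical path to your application's logs

output.elasticsearch:
  hosts: ["localhost:9200"]
```

For the module route, pointing the system module's syslog fileset at your files in modules.d/system.yml might look like this (again, the path is a placeholder):

```yaml
- module: system
  syslog:
    enabled: true
    var.paths: ["/var/log/myapp/*.log"]
```

For the processor route, here's a rough sketch combining the built-in dissect processor with a script processor running custom JavaScript. The tokenizer pattern and field names are made up for illustration; adjust them to your actual log layout:

```yaml
processors:
  - dissect:
      # hypothetical syslog-like layout: timestamp, host, program, message
      tokenizer: "%{ts} %{host} %{prog}: %{msg}"
      field: "message"
      target_prefix: "parsed"
  - script:
      lang: javascript
      source: |
        function process(event) {
          // custom per-event logic, e.g. tagging entries that mention errors
          var msg = event.Get("message");
          if (msg && msg.indexOf("ERROR") !== -1) {
            event.Put("parsed.level", "error");
          }
        }
```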

Thanks so much for your answer! Would I need to take the same steps if I wanted to ship logs to Logstash first and then on to Elasticsearch?

Similar, though Logstash has its own configuration for processing incoming events. If Filebeat is already doing everything you want, you can pass events through Logstash to Elasticsearch unchanged, or you can add some final processing in the Logstash layer for anything that's hard to express in Filebeat configuration. The basic setup is the same either way; all that changes is which processing steps run in Filebeat versus Logstash. A minimal sketch of that setup is below.
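On the Filebeat side, you'd swap the Elasticsearch output for a Logstash one (the host and port are placeholders):

```yaml
output.logstash:
  hosts: ["localhost:5044"]
```

And on the Logstash side, a minimal pipeline that accepts Beats events, optionally applies extra parsing, and forwards everything to Elasticsearch. SYSLOGLINE is one of the stock grok patterns; drop or adjust the filter block if Filebeat already did the parsing:

```
input {
  beats {
    port => 5044
  }
}

filter {
  # optional extra processing in the Logstash layer
  grok {
    match => { "message" => "%{SYSLOGLINE}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```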
