Filebeat to send application logs to Elasticsearch

Hi,
I want to use Filebeat to ship my application logs to Elasticsearch running on a log server. My application generates a log file in a particular folder. Whenever the log file reaches a configured size, a new log file is started, e.g. appTrace.001, appTrace.002, appTrace.003, and so on.
Can I make Filebeat read each new file as it is started for tracing, and close the old one? Also, is it possible to send only a subset of the traces from these application log files to Elasticsearch, or does that require Logstash?

Yes, Filebeat can automatically do this. Just point your Filebeat configuration to /path/to/your/logs/folder/* (the * at the end will help Filebeat pick up new files as they are created).
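
For example, a minimal filebeat.yml input along these lines (the path is a placeholder, and close_inactive is only shown to illustrate how Filebeat releases rotated-out files; its default is 5m):

```yaml
filebeat.inputs:
  - type: log
    # The glob picks up appTrace.001, appTrace.002, ... as they are created.
    # Path is a placeholder; adjust to your actual log folder.
    paths:
      - /path/to/your/logs/folder/appTrace.*
    # Filebeat closes a file handle once the file has been inactive for this
    # long, so old rotated-out files are released automatically (default: 5m).
    close_inactive: 2m
```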

This might depend on how you want to determine the subset. Do you want to select the subset of log lines based on some textual pattern? Or are you looking for more random sampling? Or something else?

Hi Shaunak,
Thanks for the response.
Yes, the selection of the subset will be based on some pattern. How do I achieve pattern matching in Filebeat?
Also, once these patterns match lines in the log file, should I use Logstash to get them into Kibana, or can Filebeat feed Kibana directly? I think even if Filebeat can push directly to Elasticsearch, the representation in Kibana will be very crude. Should I be creating a dashboard to beautify the log data in Kibana?

~Neeraj

Look into the exclude_lines and include_lines options here: Log input | Filebeat Reference [8.11] | Elastic. It sounds like those are what you need.
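
As a rough sketch (the patterns here are hypothetical placeholders; substitute whatever marks the traces you care about):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /path/to/your/logs/folder/appTrace.*
    # Ship only lines matching at least one of these regexes...
    include_lines: ['ERROR', 'WARN']
    # ...then drop any that match these, even if include_lines matched,
    # since include_lines is applied before exclude_lines.
    exclude_lines: ['DEBUG']
```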

I would recommend starting out by having Filebeat index directly into Elasticsearch (that is, no Logstash in the middle). Then you can go into Kibana Discover and take a look at your data. If it's sufficiently parsed out into distinct fields, you can use Kibana Visualize and Kibana Dashboard to create dashboards.

If you need further parsing, you can create an Elasticsearch Ingest Pipeline. This will run after Filebeat sends the data to Elasticsearch but before Elasticsearch indexes it. Read more about Filebeat and Ingest Pipelines over here: Parse data using an ingest pipeline | Filebeat Reference [master] | Elastic. If you still can't get the desired parsing and enrichment after that, then I'd look into adding Logstash into the picture, between Filebeat and Elasticsearch.
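
For reference, pointing Filebeat at an ingest pipeline is just an output setting in filebeat.yml; here's a sketch assuming a pipeline you have already created in Elasticsearch (the host and pipeline name are placeholders):

```yaml
output.elasticsearch:
  # Host is a placeholder; point this at the Elasticsearch on your log server.
  hosts: ["http://your-log-server:9200"]
  # Name of an ingest pipeline already defined in Elasticsearch; it runs on
  # each event after Filebeat sends it but before it is indexed.
  pipeline: "app-trace-pipeline"
```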

@shaunak Thanks for the quick feedback. I will try these options and confirm. Thanks again!

Hi,
With exclude_lines and include_lines I was able to filter my logs.
However, while I can see all the filtered logs in the Logs UI in Kibana, I cannot find them in Discover. I tried searching with the index pattern "filebeat*", but I get "No results match your search criteria". Is there something I am missing?

~Neeraj
