Hi all,
I use the Elastic Stack to centralize logs from Java web applications. Logging is based on SLF4J + Logback, so I've used the Logback JSON encoder. I've set up two possible solutions:
1. LogstashTcpSocketAppender: messages are shipped directly to Logstash (without Filebeat).
2. FileAppender with JSON-structured output: Filebeat reads the JSON log files and sends them to Logstash. In this case, Logstash needs an additional filter to parse the JSON objects.
The first option seems simpler, but the second allows offline parsing. What's the best practice in this scenario?
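For reference, both options can be configured side by side in logback.xml using the logstash-logback-encoder library. This is a minimal sketch; the host, port, and file path are placeholders, not values from this thread:

```xml
<configuration>
  <!-- Option 1: ship JSON events directly to Logstash over TCP (no Filebeat) -->
  <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <!-- hypothetical Logstash host and port -->
    <destination>logstash.example.com:5000</destination>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>

  <!-- Option 2: write JSON-structured log lines to disk for Filebeat to pick up -->
  <appender name="JSON_FILE" class="ch.qos.logback.core.FileAppender">
    <!-- hypothetical log file path -->
    <file>/var/log/myapp/app.json</file>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>

  <root level="INFO">
    <appender-ref ref="JSON_FILE"/>
  </root>
</configuration>
```

Switching between the two approaches is then just a matter of which appender-ref the root logger points at.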
I suggest you don't make your application dependent on shipping logs over the network. Keep things simple and let the application just write its logs to disk.
> Filebeat parses JSON log files and sends them to Logstash. In this case, Logstash needs an additional filter to parse the JSON objects.
Or you let Filebeat parse the JSON. Or let Logstash do it via a json codec in the beats input.
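To illustrate the Filebeat option: the log input can decode the JSON itself, so no extra Logstash filter is needed. A minimal sketch, assuming a recent Filebeat version and a hypothetical log path:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/app.json   # hypothetical path to the JSON log file
    json.keys_under_root: true    # decode each line and lift fields to the top level
    json.add_error_key: true      # flag lines that fail to parse instead of dropping them

output.logstash:
  hosts: ["logstash.example.com:5044"]   # hypothetical Logstash host
```

Alternatively, leave Filebeat's JSON settings out and add `codec => json` to the beats input in your Logstash pipeline so Logstash does the decoding instead.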