I need to send log files generated using Log4j on client machines to Elasticsearch installed on a server. The logs are in XML format.
Is any additional plugin required, or can this be done with Filebeat alone?
As mentioned here:
"This plugin is deprecated. It is recommended that you use filebeat to collect logs from log4j."
So, how do we collect logs from log4j using Filebeat?
I'm not aware of Filebeat being able to parse XML. My recommendation would be to configure Log4j to log with a JSON layout, which can be easily parsed by Filebeat and forwarded directly to Elasticsearch.
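To illustrate, a minimal Log4j 2 configuration using the JsonLayout might look like the sketch below. The file path, logger level, and appender name are placeholders, not from this thread; adjust them to your environment:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch: write one JSON object per line (NDJSON) so Filebeat can
     treat each line as a complete event. Paths are placeholders. -->
<Configuration status="warn">
  <Appenders>
    <File name="JsonFile" fileName="logs/app.json">
      <!-- compact + eventEol produces newline-delimited JSON -->
      <JsonLayout compact="true" eventEol="true"/>
    </File>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="JsonFile"/>
    </Root>
  </Loggers>
</Configuration>
```

With output like this, Filebeat's JSON decoding options can lift the fields out of each line without any Logstash in between.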
If you could provide some details about your logging environment we might be able to give some more specific advice.
Thanks for your quick response. I will look into the JSON part.
However, our environment generates logs in XML format. If I use Logstash, could I parse the XML there? If so, is there any documentation I can refer to, preferably related to the Log4j XML format?
I've seen a similar query here:
But I'm not clear how exactly to proceed.
Logstash can indeed parse XML fragments using the XML filter plugin. If the log format is not something you can change you could have...
- filebeat ship the log events to logstash, which parses the XML and forwards the JSON to Elasticsearch or
- logstash read the log file directly, parse the XML and forward the JSON to Elasticsearch
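As a sketch of the first option, a Logstash pipeline receiving events from Filebeat and parsing them with the XML filter plugin could look like this. The port, target field name, Elasticsearch host, and index pattern are assumptions to be adapted:

```
# Sketch: Filebeat -> Logstash (xml filter) -> Elasticsearch.
# Hosts, ports, and field names below are placeholders.
input {
  beats {
    port => 5044
  }
}

filter {
  xml {
    source      => "message"   # the raw XML fragment shipped by Filebeat
    target      => "log4j"     # parsed fields land under this key
    force_array => false       # keep single values as scalars, not arrays
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "log4j-%{+YYYY.MM.dd}"
  }
}
```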
Using filebeat to ship the logs involves more moving parts, but it is optimized to be deployed on edge machines. This would be my recommendation to ingest XML-based logs.
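On the Filebeat side, a minimal `filebeat.yml` for this setup might look like the following. The log path and Logstash host are placeholders, and the multiline pattern is an assumption based on Log4j's XMLLayout, which wraps each event in a `<log4j:event>...</log4j:event>` element spanning several lines:

```yaml
# Sketch: ship multi-line Log4j XML events to Logstash.
# Paths, hosts, and the multiline pattern are assumptions.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.xml          # placeholder path
    # Join all lines that do NOT start a new event onto the
    # preceding '<log4j:event' line, so each XML event ships whole.
    multiline.pattern: '^<log4j:event'
    multiline.negate: true
    multiline.match: after

output.logstash:
  hosts: ["logstash-host:5044"]       # placeholder host
```

Getting the multiline settings right matters here: without them, Filebeat would ship each physical line separately and the XML filter in Logstash would only ever see fragments.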
I've been searching for a good example of this, but I'm unable to find one. As per your recommendation, I would like to use Filebeat to ship the log events to Logstash, which parses the XML and sends the result to Elasticsearch. Could you please point me to a sample of such a setup? I'm looking for the corresponding Filebeat and Logstash configurations.