Filebeat to analyze XML logs

Hello, it's Suman Ghorashine.

I am new to Wazuh and the ELK stack, and I wanted to ask about a few things.

I have an XML-format log configured in the agent configuration file to be sent to the Wazuh server for analysis. But when I filter the logs by location in Wazuh, I get nothing; it says empty. I also tried different time ranges, with no luck.

Any help on this matter would be greatly appreciated.

We are using the Filebeat, Elasticsearch, and Kibana stack.

Hi!
I'm not sure I understand the setup you're using. If you're having issues within Wazuh (the standalone app or the Kibana plugin), you should direct your question to the developers of that software.

If you're having issues within Kibana or with your data management in Elasticsearch for the data collected with Filebeat then please explain what issue you're facing within the Elastic stack!

Are you able to view your logs using Discover for example?

Thank you for your response.

I have configured my Wazuh agent in /var/ossec/etc/ossec.conf like this:

  <localfile>
    <log_format>syslog</log_format>
    <location>/var/log/izt/QRIF/qrif.log</location>
  </localfile>

qrif.log is in XML format, and I want to view this log on my Wazuh dashboard.

These logs are processed by the Wazuh manager, which runs on top of Elasticsearch as the analytics engine, and the data is visualized in Kibana.

I am not getting any logs in Discover.

I see.

Can you also share your Filebeat config showing how you collect the logs from qrif.log?
Do you have any errors in your console log from Filebeat?
In Kibana, within Stack Management -> Index Management, can you see any filebeat-* indices?

Let's first establish that you're actually getting logs into Elasticsearch and go from there!
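For example, assuming a typical single-node setup with Elasticsearch on 127.0.0.1:9200 over HTTPS and basic auth (adjust the host, credentials, and certificate options to your environment), you can list the relevant indices directly with the _cat API:

  curl -k -u <user>:<password> "https://127.0.0.1:9200/_cat/indices/filebeat-*,wazuh-alerts-*?v"

If nothing comes back, the data is not reaching Elasticsearch at all.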

Here is my filebeat.yml file:

# Wazuh - Filebeat configuration file
filebeat.modules:
  - module: wazuh
    alerts:
      enabled: true
    archives:
      enabled: false

setup.template.json.enabled: true
setup.template.json.path: '/etc/filebeat/wazuh-template.json'
setup.template.json.name: 'wazuh'
setup.template.overwrite: true
setup.ilm.enabled: false

output.elasticsearch.hosts: ['127.0.0.1:9200']
output.elasticsearch.protocol: https
output.elasticsearch.ssl.certificate: "/etc/filebeat/certs/wazuh-manager.crt"
output.elasticsearch.ssl.key: "/etc/filebeat/certs/wazuh-manager.key"
output.elasticsearch.ssl.certificate_authorities: ["/etc/filebeat/certs/ca/ca.crt"]

output.elasticsearch.username: "<user>"
output.elasticsearch.password: "<password>"


filebeat.config.modules:
  enabled: true
  path: ${path.config}/modules.d/*.yml

Where should I put the location of the log to be monitored for my scenario? I have only added the log location in /var/ossec/etc/ossec.conf on the agent side.

We are not getting the qrif.log logs anywhere. Can you please tell me where the logs will be sent based on the following configuration in /var/ossec/etc/ossec.conf on the agent side?

  <localfile>
    <log_format>syslog</log_format>
    <location>/var/log/izt/QRIF/qrif.log</location>
  </localfile>

I don't see any issues in the Filebeat log.

Also, we do have filebeat-* indices under Stack Management -> Index Management.

In the documentation I found that the Wazuh agent sends all the logs to the Wazuh manager, and from there the logs are sent to Elasticsearch and Kibana.

Your issue seems to be with your Wazuh configuration. Wazuh is not supported here; you will need to contact the Wazuh developers on their GitHub.

If your qrif.log is a multi-line XML file, you cannot use syslog as the log_format in your ossec.conf.

Wazuh can read multi-line files, but I'm not sure exactly how you will need to configure it; you need to check the Wazuh documentation and ask the Wazuh developers.
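Just as a rough, untested sketch of what that might look like (the multi-line-regex log format and the multiline_regex option are my understanding of the Wazuh documentation, and the regex here is only a placeholder you would need to adapt to your XML entries; please double-check all of this with the Wazuh docs or developers):

  <localfile>
    <log_format>multi-line-regex</log_format>
    <location>/var/log/izt/QRIF/qrif.log</location>
    <!-- placeholder: a regex matching the first line of each XML event -->
    <multiline_regex>^REGEX_MATCHING_START_OF_EVENT</multiline_regex>
  </localfile>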

Yes, qrif.log is a multi-line XML file.

Does that mean I have to change my Wazuh manager configuration to handle the multi-line files?

But I am not getting any logs of any sort, not even from the Wazuh manager, including qrif.log, when I try to filter with location: /var/log/izt/QRIF/qrif.log in the KQL bar.

As mentioned earlier, Wazuh is not supported here; you need to contact the Wazuh developers if you are having any issues with Wazuh.

If you can see other logs from Wazuh in your Elasticsearch, then there is no issue with Filebeat; it just reads the Wazuh logs.
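One quick way to confirm that, assuming the same host and credentials as in your filebeat.yml and that the alerts go to the default wazuh-alerts-* indices, is to count the documents that reference your file path directly in Elasticsearch:

  curl -k -u <user>:<password> "https://127.0.0.1:9200/wazuh-alerts-*/_count" \
    -H 'Content-Type: application/json' \
    -d '{"query": {"match_phrase": {"location": "/var/log/izt/QRIF/qrif.log"}}}'

If this returns 0 while other Wazuh alerts are present, the qrif.log events are not making it through the Wazuh manager, and Filebeat/Elasticsearch are not the problem.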

If the log you want is not present in the Wazuh logs, then your issue is with Wazuh and you need to contact the community/developers to understand how to solve it.
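As a side note for when the data does start arriving: in the Discover KQL bar it is safer to quote the path value, since it contains slashes, for example:

  location : "/var/log/izt/QRIF/qrif.log"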

Thank you so much for your help.

I will get back to you with an update on what I do.

Again, thank you so much.
