Filebeat sending logs issue

Hello everyone,

I have a problem with the Filebeat inputs. I installed and configured Filebeat on a CentOS 7 machine running Prelude SIEM, and I am using Filebeat to ship the Prelude logs to ELK. For the moment, the "system" logs (with the "system" module enabled) are sent correctly to my remote ELK server, but this is not the case for the "prelude-xml.log" and "prelude.log" log files.

My file "filebeat.yml" seems to be correctly configured with the correct paths to the log file:

    filebeat.inputs:
    - type: log
      paths:
        - /var/log/prelude.log
        - /var/log/prelude-xml.log

The "system" module is well deactivated and the "prelude-xml.log" file seems to be loaded by Filebeat:

    mai 27 11:45:43 localhost.localdomain filebeat[3330]: 2020-05-27T11:45:43.325+0200        WARN        beater/filebeat.go:368        Filebeat is unable to load the Ingest Node pipelines for the configured modu...
    mai 27 11:45:43 localhost.localdomain filebeat[3330]: 2020-05-27T11:45:43.325+0200        INFO        crawler/crawler.go:72        Loading Inputs: 1
    mai 27 11:45:43 localhost.localdomain filebeat[3330]: 2020-05-27T11:45:43.325+0200        INFO        [monitoring]        log/log.go:118        Starting metrics logging every 30s
    mai 27 11:45:43 localhost.localdomain filebeat[3330]: 2020-05-27T11:45:43.329+0200        INFO        log/input.go:152        Configured paths: [/var/log/prelude.log /var/log/prelude-xml.log]
    mai 27 11:45:43 localhost.localdomain filebeat[3330]: 2020-05-27T11:45:43.329+0200        INFO        input/input.go:114        Starting input of type: log; ID: 15332088532770242788
    mai 27 11:45:43 localhost.localdomain filebeat[3330]: 2020-05-27T11:45:43.329+0200        INFO        crawler/crawler.go:106        Loading and starting Inputs completed. Enabled inputs: 1
    mai 27 11:45:43 localhost.localdomain filebeat[3330]: 2020-05-27T11:45:43.330+0200        INFO        cfgfile/reload.go:171        Config reloader started
    mai 27 11:45:43 localhost.localdomain filebeat[3330]: 2020-05-27T11:45:43.331+0200        INFO        cfgfile/reload.go:226        Loading of config files completed.
    mai 27 11:45:43 localhost.localdomain filebeat[3330]: 2020-05-27T11:45:43.437+0200        INFO        log/harvester.go:251        Harvester started for file: /var/log/prelude-xml.log
    mai 27 11:45:43 localhost.localdomain filebeat[3330]: 2020-05-27T11:45:43.438+0200        INFO        log/harvester.go:251        Harvester started for file: /var/log/prelude.log

I have done a lot of tests with different configurations, and despite some research on the forums, I did not find a solution.

I do not think I have understood very well how Filebeat works. My log file is in XML format; do I need an XML module, for example?

Does anyone have an idea :slight_smile: ? Thank you!

Pierre

hi @PierreR,

I'm not sure you can do this with Filebeat alone.
You could use multiline in Filebeat to group all the lines of each XML message into one event:

    filebeat.prospectors:
      - paths: ['/var/log/prelude-xml.log']
        multiline.pattern: ^<{the start of your line}>
        multiline.negate: true
        multiline.match: after
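
For example, a minimal (untested) sketch using the newer "filebeat.inputs" syntax (prospectors were renamed to inputs in recent Filebeat versions), assuming every record in "prelude-xml.log" starts with an "<IDMEF-Message" opening tag; that tag name is an assumption, so check your actual file:

    filebeat.inputs:
    - type: log
      paths:
        - /var/log/prelude-xml.log
      # Assumption: each new record begins with an <IDMEF-Message opening tag.
      multiline.pattern: '^<IDMEF-Message'
      # Lines that do NOT match the pattern are appended to the previous line that did.
      multiline.negate: true
      multiline.match: after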

Then, in either Logstash or an Ingest Node pipeline, use grok to split the XML and any surrounding text into separate fields.

Then apply the xml filter to parse those messages.
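
If you send the events to Logstash, a rough sketch of such a filter could look like this (the "parsed_xml" target is just an illustrative field name, and it assumes the whole multiline event in "message" is one XML document):

    filter {
      xml {
        # Assumes the whole multiline event in "message" is one XML document.
        source => "message"
        # "parsed_xml" is just an illustrative field name.
        target => "parsed_xml"
        store_xml => true
      }
    }

A grok filter in front of it would only be needed if each event carries extra text (a timestamp, for instance) before the XML itself.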

Thanks for your help @MarianaD :smiley: !

I will test with Filebeat Prospectors and I will let you know if it works.

Pierre
