Filebeat 5.4 system and auditd modules not working

I'm trying to install the newest version of Filebeat, and I enabled the system and auditd modules.
(A new index template is installed with `"template": "fb-*",`.)

ES 5.3.2, Kibana 5.3.2, LS 5.3.2
The ingest node role is enabled by default!

This is my config:

```
#==========================  Modules configuration ============================
filebeat.modules:

#------------------------------- System Module -------------------------------
- module: system
  # Syslog
  syslog:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ['/var/log/secure']


#------------------------------- Auditd Module -------------------------------
- module: auditd
  log:
    enabled: true
    var.paths: ['/var/log/audit/audit.log']

output.logstash:
  enabled: true
  hosts: ["LS01:5044","LS02:5044"]
  worker: 2
  compression_level: 3
  loadbalance: true

  index: 'fb-test'
```

When I check the logs in Kibana, the syslog and audit events are not parsed:

```
{
  "_index": "fb-test-2017.18",
  "_type": "log",
  "_id": "AVvW1Qtgjg3FaMDlHThW",
  "_score": null,
  "_source": {
    "@timestamp": "2017-05-05T04:19:19.578Z",
    "offset": 4558324,
    "beatname": "fb-test",
    "beattype": "log",
    "@version": "1",
    "input_type": "log",
    "beat": {
      "hostname": "Nginx-LB-Inside-02",
      "name": "Nginx-LB-Inside-02",
      "version": "5.4.0"
    },
    "host": "Nginx-LB-Inside-02",
    "source": "/var/log/audit/audit.log",
    "message": "type=CRED_ACQ msg=audit(1493957956.583:4451): user pid=20104 uid=0 auid=1002 ses=720 msg='op=PAM:setcred acct=\"root\" exe=\"/usr/bin/sudo\" hostname=? addr=? terminal=/dev/pts/1 res=success'",
    "type": "log",
    "tags": [
      "beats_input_codec_plain_applied"
    ]
  },
  "fields": {
    "@timestamp": [
      1493957959578
    ]
  },
  "highlight": {
    "beat.name": [
      "@kibana-highlighted-field@Nginx-LB-Inside-02@/kibana-highlighted-field@"
    ],
    "beat.hostname": [
      "@kibana-highlighted-field@Nginx-LB-Inside-02@/kibana-highlighted-field@"
    ],
    "host": [
      "@kibana-highlighted-field@Nginx-LB-Inside-02@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1493957959578
  ]
}
```

Is something wrong in my config?
Thanks!

Filebeat modules rely on Elasticsearch ingest pipelines to parse and process the data. If you write directly to Elasticsearch, this is set up automatically, but as far as I know it is not done automatically when sending through Logstash (as per the note on the page I linked to). Your Logstash config will therefore likely need to change so that it directs the data to the correct ingest pipeline.
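A minimal sketch of such a Logstash output, assuming Elasticsearch on localhost:9200 (replace with your host); the pipeline ID pattern shown is an assumption, since the names Filebeat registers are version-specific, so verify the exact IDs on your cluster with `GET _ingest/pipeline`:

```
output {
  elasticsearch {
    # assumption: replace localhost with your Elasticsearch host
    hosts => ["localhost:9200"]
    index => "fb-test-%{+YYYY.MM.dd}"
    # fileset.module and fileset.name are set on each event by Filebeat modules;
    # the pipeline ID pattern below is an assumption, check GET _ingest/pipeline
    pipeline => "filebeat-5.4.0-%{[fileset][module]}-%{[fileset][name]}-pipeline"
  }
}
```

Keep in mind that with the Logstash output Filebeat does not load those pipelines into Elasticsearch for you; they must already exist, for example from a one-off run of Filebeat with the Elasticsearch output enabled.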

You mean I need to change the output to an ES ingest node?
Then I will use grok filters in Logstash instead, because I have many log files under filebeat.prospectors that need to be parsed.
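If it helps, a bare-bones grok sketch for audit lines like the one in the event above; the field names are made up for illustration and are not the auditd module's schema:

```
filter {
  grok {
    # illustrative only: pulls the record type and the audit
    # timestamp/sequence out of lines such as
    # type=CRED_ACQ msg=audit(1493957956.583:4451): user pid=20104 ...
    match => {
      "message" => "type=%{WORD:audit_type} msg=audit\(%{NUMBER:audit_epoch}:%{NUMBER:audit_sequence}\): %{GREEDYDATA:audit_message}"
    }
  }
}
```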

And when I ran Filebeat in debug mode, I saw this error in the log:

```
ERR Not loading modules. Module directory not found : /usr/share/filebeat/bin/module
```

Exactly: by default the module directory is /usr/share/filebeat/module (CentOS).
How do I change that directory in the config?
Thanks!

Filebeat looks for the module directory under path.home. Adjusting it in your Filebeat config should help.
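For example, in filebeat.yml (the path shown is the default install location on CentOS; adjust it to wherever your module/ directory lives):

```
# point path.home at the install prefix that contains the module/ directory
path.home: /usr/share/filebeat
```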


I have the same problem with the same config.
path.home doesn't help...

```
[root@elastic filebeat]# ./bin/filebeat -e -c /etc/filebeat/filebeat.yml -path.home /usr/share/filebeat/
2017/05/17 12:30:56.293475 beat.go:285: INFO Home path: [/usr/share/filebeat/] Config path: [/usr/share/filebeat/] Data path: [/usr/share/filebeat//data] Logs path: [/usr/share/filebeat//logs]
2017/05/17 12:30:56.293524 beat.go:186: INFO Setup Beat: filebeat; Version: 5.4.0
2017/05/17 12:30:56.293593 metrics.go:23: INFO Metrics logging every 30s
2017/05/17 12:30:56.293664 logstash.go:90: INFO Max Retries set to: 3
2017/05/17 12:30:56.293748 outputs.go:108: INFO Activated logstash as output plugin.
2017/05/17 12:30:56.293842 publish.go:295: INFO Publisher name: elastic.sbr.local
2017/05/17 12:30:56.294097 async.go:63: INFO Flush Interval set to: 1s
2017/05/17 12:30:56.294112 async.go:64: INFO Max Bulk Size set to: 2048
2017/05/17 12:30:56.315062 beat.go:221: INFO filebeat start running.
2017/05/17 12:30:56.315100 filebeat.go:81: WARN Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2017/05/17 12:30:56.315134 registrar.go:85: INFO Registry file set to: /usr/share/filebeat/data/registry
2017/05/17 12:30:56.315163 registrar.go:106: INFO Loading registrar data from /usr/share/filebeat/data/registry
2017/05/17 12:30:56.316462 registrar.go:123: INFO States Loaded from registrar: 8
2017/05/17 12:30:56.316500 crawler.go:38: INFO Loading Prospectors: 4
2017/05/17 12:30:56.316624 prospector_log.go:65: INFO Prospector with previous states loaded: 3
2017/05/17 12:30:56.316715 prospector.go:124: INFO Starting prospector of type: log; id: 17005676086519951868
2017/05/17 12:30:56.316954 prospector_log.go:65: INFO Prospector with previous states loaded: 5
2017/05/17 12:30:56.317090 prospector.go:124: INFO Starting prospector of type: log; id: 4384193151192871875
2017/05/17 12:30:56.317227 prospector_log.go:65: INFO Prospector with previous states loaded: 0
2017/05/17 12:30:56.317334 prospector.go:124: INFO Starting prospector of type: log; id: 3977614963612598612
2017/05/17 12:30:56.317464 prospector_log.go:65: INFO Prospector with previous states loaded: 0
2017/05/17 12:30:56.317534 prospector.go:124: INFO Starting prospector of type: log; id: 12958680918179246529
2017/05/17 12:30:56.317546 crawler.go:58: INFO Loading and starting Prospectors completed. Enabled prospectors: 4
2017/05/17 12:30:56.317560 registrar.go:236: INFO Starting Registrar
2017/05/17 12:30:56.317595 sync.go:41: INFO Start sending events to output
2017/05/17 12:30:56.317634 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2017/05/17 12:30:56.318264 log.go:91: INFO Harvester started for file: /var/log/secure

```
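The WARN line above is the telling part: with the Logstash output, Filebeat 5.4 does not load the module ingest pipelines into Elasticsearch, so events arrive unparsed unless the pipelines already exist. You can check what is registered with a plain ingest API call (localhost:9200 stands in for your Elasticsearch host):

```
# list all ingest pipelines currently registered in Elasticsearch
curl 'http://localhost:9200/_ingest/pipeline?pretty'
```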

Sorry @eds, I can't really help; I removed the system module, and I don't use Elasticsearch ingest pipelines to parse and process the data. In my case I need to send logs to Logstash, because some log types have to be parsed there. If you want to use the modules in Filebeat, you need to set the output to Elasticsearch, roughly as in the sketch below.
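That means something like this in filebeat.yml (hosts is a placeholder); with this output Filebeat loads the module ingest pipelines itself:

```
output.elasticsearch:
  # assumption: replace with your Elasticsearch host(s)
  hosts: ["localhost:9200"]
  index: "fb-test"
```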

For my part, I want to monitor audit and authentication logs, so I use Wazuh. Wazuh parses the logs automatically and sends the cleaned-up events to the Elasticsearch cluster.

If you like, you can read more information here.

@tatdat thanks for the reply and the link! I can't send events straight to Elasticsearch because I must parse some events and send mail alerts.

So you're in the same situation as me; you can use Wazuh, it's helpful.
I have a question: how do you send mail alerts? Do you use ElastAlert?
