Hello, I'm new to the stack. I have a bunch of logs I wanted to try out, but I was only able to get the auditd module working (it's the commented-out part below):
```yaml
- module: system
# - module: auditd
#   enabled: true
#   var.paths: ["/media/sf_Work/elastic/project-data/n1/ec2/log/audit/*"]
```
I run the following command and place the logs where they should be:

```
filebeat -e -v -c ./filebeat.yml -strict.perms=false
```
I don't get any errors, but I also don't get any system-module-related docs (again, auditd works).
Do I have to write some sort of custom parser or am I missing something?
When you run Filebeat, you should see an info message like:
```
2018-03-19T18:11:08.948-0700 INFO log/harvester.go:216 Harvester started for file:...
```
If you don't, make sure you specify the correct glob pattern to find the files.
See the Filebeat docs about the `paths` option for more about supported glob patterns.
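As an illustration, an enabled system module config might look something like this (a sketch only; the paths are assumptions based on the directory layout mentioned earlier in this thread, so adjust them to your environment):

```yaml
# modules.d/system.yml (or the equivalent section in filebeat.yml)
- module: system
  syslog:
    enabled: true
    # The trailing "*" also matches rotated files such as
    # secure-20171112, secure-20171119, etc.
    var.paths: ["/media/sf_Work/elastic/project-data/n1/ec2/log/secure/*"]
```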
If that doesn't solve your problem, please format your config example as code (wrap it in triple backticks) and also provide the Filebeat version and the console output that you see when you run Filebeat with the `-e` option specified.
Thank you so much for replying.
```
2018-03-19T18:38:31.168-0700 INFO log/prospector.go:111 Configured paths: [/var/log/auth.log* /var/log/secure*]
2018-03-19T18:38:31.174-0700 INFO log/harvester.go:216 Harvester started for file: /var/log/auth.log
2018-03-19T18:38:31.183-0700 INFO log/prospector.go:111 Configured paths: [/media/sf_Work/elastic/project-data/n1/ec2/log/secure/*]
2018-03-19T18:38:31.196-0700 INFO log/prospector.go:111 Configured paths: [/media/sf_Work/elastic/project-data/n1/ec2/log/audit/*]
2018-03-19T18:38:31.196-0700 INFO crawler/crawler.go:82 Loading and starting Prospectors completed. Enabled prospectors: 3
2018-03-19T18:38:31.244-0700 INFO log/harvester.go:216 Harvester started for file: /media/sf_Work/elastic/project-data/n1/ec2/log/audit/audit.log.1
2018-03-19T18:38:31.286-0700 INFO log/harvester.go:216 Harvester started for file: /media/sf_Work/elastic/project-data/n1/ec2/log/audit/audit.log.2
2018-03-19T18:38:31.294-0700 INFO log/harvester.go:216 Harvester started for file: /media/sf_Work/elastic/project-data/n1/ec2/log/audit/audit.log.3
2018-03-19T18:38:31.339-0700 INFO log/harvester.go:216 Harvester started for file: /media/sf_Work/elastic/project-data/n1/ec2/log/audit/audit.log
2018-03-19T18:38:31.380-0700 INFO elasticsearch/client.go:690 Connected to Elasticsearch version 6.2.2
2018-03-19T18:38:31.405-0700 INFO template/load.go:73 Template already exists and will not be overwritten.
2018-03-19T18:38:51.388-0700 INFO log/harvester.go:216 Harvester started for file: /media/sf_Work/elastic/project-data/n1/ec2/log/secure/secure
2018-03-19T18:38:51.477-0700 INFO log/harvester.go:216 Harvester started for file: /media/sf_Work/elastic/project-data/n1/ec2/log/secure/secure-20171112
2018-03-19T18:38:51.907-0700 INFO log/harvester.go:216 Harvester started for file: /media/sf_Work/elastic/project-data/n1/ec2/log/secure/secure-20171119
2018-03-19T18:38:52.123-0700 INFO log/harvester.go:216 Harvester started for file: /media/sf_Work/elastic/project-data/n1/ec2/log/secure/secure-20171126
2018-03-19T18:38:52.259-0700 INFO log/harvester.go:216 Harvester started for file: /media/sf_Work/elastic/project-data/n1/ec2/log/secure/secure-20171203
```
It's not the same as the original config, but it seems alright. Although I don't understand why /var/log/auth.log is being harvested, because I have no config for that.
```
Nov 6 03:00:42 ip-172-31-28-22 sshd: Accepted publickey for ec2-user from 192.168.11.14 port 39638 ssh2: RSA SHA256:zJaK017NHbZJCdAFdN3YEOFXOvP+gT3edI+rO00xIPs
```
This is part of the secure log, and I realized it doesn't have a year. Could that be a potential issue?
Is it possible that you are running Filebeat with the system module enabled in both filebeat.yml and the modules.d directory? The default config in the modules.d directory reads /var/log/auth.log. Run `filebeat modules list` to see the list of enabled modules. See the Filebeat docs for more about enabling modules.
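As a sketch, checking for and resolving a duplicate module definition could look like this on the command line (subcommand names as in Filebeat 6.x; whether you disable the modules.d copy or remove the filebeat.yml copy is up to you):

```
# Show which modules are enabled in the modules.d directory
filebeat modules list

# If "system" appears as enabled here AND you also configure it in
# filebeat.yml, you are running it twice; disable one copy, e.g.:
filebeat modules disable system
```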
Regarding your question about the year: if the year isn't specified in the log file, Filebeat sets it to the current year automatically.
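To illustrate why the missing year matters, here is a small Python sketch (not Filebeat's actual code) showing that a classic syslog timestamp parses with a placeholder year of 1900 unless the consumer injects one, which is essentially what Filebeat does with the current year:

```python
from datetime import datetime

# Timestamp taken from the secure log line quoted above; note it has no year.
stamp = "Nov 6 03:00:42"
parsed = datetime.strptime(stamp, "%b %d %H:%M:%S")
print(parsed.year)  # 1900 -- strptime's placeholder when the year is absent

# A consumer has to supply a year itself; Filebeat substitutes the
# current year for such timestamps.
fixed = parsed.replace(year=datetime.now().year)
print(fixed)
```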
Thank you so much for helping out. You were right about the enabled modules. However, I ended up using Logstash, grokking the files directly, and got what I wanted.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.