Just thinking out loud here....
Is it possible to leverage filebeat to ingest the installed rpms on any given host and use elastic as a means to see what vulnerabilities we have in our environment? I'm just about positive that I can, but I'm trying to think of how to piece it all together. Something like a poor man's Nessus report, if you will.
Each host could have a daily cron job write to a file that filebeat picks up and reads. I could put that into a new index and format the data into kv pairs for searching, but I'm getting stuck on exactly how best to achieve this.
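A minimal sketch of the cron side, assuming a hypothetical script name and path (`/usr/local/bin/dump-rpms.py` and `/etc/cron.d/rpm-inventory` are placeholders, not anything from this thread):

```
# /etc/cron.d/rpm-inventory (hypothetical): dump installed rpms daily at 02:00
0 2 * * * root /usr/local/bin/dump-rpms.py
```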
Has anyone attempted to do something like this?
Have you considered indexing yum logs?
Personally I haven't done this, but your approach sounds reasonable. As filebeat runs as a daemon/service, it will be active, waiting for updates from the cron job and publishing the new logs.
The easiest would be if your script would operate like this:
1. fetch packages + versions + other metadata
2. create a temporary output file
3. write each entry as a JSON document (one line per entry, terminated with a newline character) to the temporary file
4. fsync + close the temporary file
5. replace the old log file with the temporary file
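The steps above could be sketched in Python roughly like this. The `rpm -qa --qf` query format is standard, but the log path and field names are assumptions you'd adapt to your own setup:

```python
import json
import os
import subprocess
import tempfile

# Hypothetical path that filebeat would be configured to watch.
LOG_PATH = "/var/log/rpm-packages.json"


def fetch_packages():
    """Step 1: query installed rpms, one tab-separated line per package."""
    out = subprocess.check_output(
        ["rpm", "-qa", "--qf", "%{NAME}\t%{VERSION}\t%{RELEASE}\t%{ARCH}\n"],
        text=True,
    )
    for line in out.splitlines():
        name, version, release, arch = line.split("\t")
        yield {"name": name, "version": version, "release": release, "arch": arch}


def publish(entries, log_path=LOG_PATH):
    """Steps 2-5: write JSON lines to a temp file, fsync, then atomically replace."""
    # The temp file must live in the same directory as the log file,
    # otherwise os.replace() may not be an atomic rename.
    directory = os.path.dirname(os.path.abspath(log_path))
    fd, tmp_path = tempfile.mkstemp(dir=directory, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as f:
            for entry in entries:
                f.write(json.dumps(entry) + "\n")  # one JSON document per line
            f.flush()
            os.fsync(f.fileno())  # make sure contents hit disk before the rename
        os.replace(tmp_path, log_path)  # atomic swap of old log for new
    except BaseException:
        os.unlink(tmp_path)
        raise


if __name__ == "__main__":
    publish(fetch_packages())
```

The atomic replace matters: filebeat only ever sees either the complete old file or the complete new one, never a half-written inventory.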
After the final step (replacing the log file), filebeat will pick up the new file and publish all entries. On failure, filebeat will retry until all contents have been indexed.
The reason for JSON is that filebeat can already parse it. This way you won't need any additional processing/parsing before your documents are indexed.
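The JSON decoding is just a couple of lines in the filebeat config. A sketch, assuming the hypothetical log path from above:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/rpm-packages.json   # hypothetical path written by the cron script
    json.keys_under_root: true       # lift the parsed fields to the event top level
    json.add_error_key: true         # flag events whose JSON failed to parse
```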
Given that rpm can be queried for the package data, a python script should just be a few lines.