Is it possible to leverage Filebeat to ingest the installed RPMs on any given host and use Elastic as a means to see what vulnerabilities we have in our environment? I'm just about positive that I can, but I'm trying to think of how to piece it all together. Something like a poor man's Nessus report, if you will.
Each host could have a daily cron job write to a file that Filebeat picks up and reads. I could put that into a new index and format the data into key/value pairs for searching, but I'm getting stuck on exactly how best to achieve this.
Personally I haven't done this, but your approach sounds reasonable. Since Filebeat runs as a daemon/service, it will stay active, waiting for updates from the cron job, and publish the new log entries.
The easiest would be if your script operated like this:
1. Fetch packages + versions + other metadata.
2. Create a temporary output file.
3. Write each entry as a JSON document (one line per entry, each line terminated with a newline character) to the temporary file.
4. fsync + close the temporary file.
5. Replace the old log file with the temporary file.
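The write/rename steps above can be sketched in Python. This is a minimal sketch, not the poster's actual script; the file path and package fields are made up for illustration. The key points are one JSON document per line (NDJSON), `fsync` before close, and `os.replace` for an atomic swap so Filebeat never sees a half-written file:

```python
import json
import os
import tempfile

def write_inventory(packages, log_path):
    """Atomically replace log_path with one JSON document per line.

    `packages` is a list of dicts (name/version/etc.). The temp file is
    created in the same directory so os.replace() stays on one filesystem
    and the rename is atomic.
    """
    directory = os.path.dirname(os.path.abspath(log_path))
    fd, tmp_path = tempfile.mkstemp(dir=directory, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as tmp:
            for pkg in packages:
                tmp.write(json.dumps(pkg) + "\n")  # one JSON doc per line
            tmp.flush()
            os.fsync(tmp.fileno())                 # step 4: fsync + close
        os.replace(tmp_path, log_path)             # step 5: atomic swap
    except BaseException:
        os.unlink(tmp_path)
        raise

# Example with made-up package metadata:
write_inventory(
    [{"name": "openssl", "version": "1.0.2k", "release": "19.el7"}],
    "/tmp/packages.ndjson",
)
```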
After step 5, Filebeat will pick up the new file and publish all entries. On failure, Filebeat will retry until all contents have been indexed.
The reason for JSON is that Filebeat can already parse it. This way you won't need any additional processing/parsing before indexing your documents.
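For reference, the JSON decoding can be enabled on the input itself. A rough sketch of the relevant config (the path is an assumption to match the example script; option names may differ slightly between Filebeat versions, so check the docs for yours):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /tmp/packages.ndjson   # wherever your cron script writes its output
    json.keys_under_root: true # lift the decoded fields to the top level
    json.add_error_key: true   # flag lines that fail to parse as JSON
```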
Using the rpm Python package, the script should be just a few lines.
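If the rpm Python bindings aren't available, shelling out to `rpm -qa --queryformat` works too. A small sketch (the `|`-separated query format and field names are my own choice, not anything the rpm tool mandates):

```python
import json
import subprocess

# One package per line, fields separated by '|'; %{NAME} etc. are rpm query tags.
QUERY_FORMAT = "%{NAME}|%{VERSION}|%{RELEASE}|%{ARCH}\n"

def parse_line(line):
    """Turn one line of rpm query output into a dict ready for JSON encoding."""
    name, version, release, arch = line.split("|")
    return {"name": name, "version": version, "release": release, "arch": arch}

def fetch_packages():
    """Return installed packages as a list of dicts, via `rpm -qa`."""
    out = subprocess.check_output(
        ["rpm", "-qa", "--queryformat", QUERY_FORMAT], text=True
    )
    return [parse_line(line) for line in out.splitlines() if line]

# parse_line on a sample output line (made-up package data):
doc = parse_line("openssl|1.0.2k|19.el7|x86_64")
print(json.dumps(doc))
```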