Feeding Logstash


(Stiv Ostenberg) #1

I am part of a team that manages a large number of AWS Accounts. I wrote a program that can read the credentials for all the accounts from the Windows Credential Store. It then uses those credentials to pull data from all the accounts and present it in an easily searchable and filterable format. I keep adding more data as I find more useful information to pull from AWS.

This is useful in the immediate term as it allows me to get a snapshot of the AWS configuration, but it does not tell me useful things like "When did a change occur for instanceID XXX?". I currently have an "Export to Excel" function, and I would love to add an "Export to Logstash" function. Is this practical?

This project is something that has been evolving slowly as I discover new information I need; currently it is pulling data for EC2, S3, and IAM. It has some issues with screens not updating and requiring me to hit the refresh buttons, but it is available on GitHub to anybody who wants to try it (and has a PC).


(Magnus Bäck) #2

An easy and robust way would be for your program that extracts the AWS data to also emit JSON files that Logstash can read. You'll probably want to have some kind of change detection mechanism though so that you only emit events when something actually changes. A simple way of doing that could be to compute a hash of the current data before you make an update, hash the new data, and if they're different emit a JSON-formatted event into the file that Logstash monitors.
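A minimal sketch of that idea in Python (the original tool's language isn't stated here, so this is purely illustrative; the function name, file path, and in-memory state dict are all hypothetical): hash a canonical serialization of each resource's data, and only append an NDJSON event to the file Logstash tails when the hash changes.

```python
import hashlib
import json

# Last-seen hash per resource ID. In a real tool you would persist
# this between runs (e.g. to a small state file) instead of memory.
STATE = {}

def emit_if_changed(resource_id, data, out_path="aws_changes.ndjson"):
    """Append a JSON event to the file Logstash monitors, but only
    when the resource's data has actually changed since last time."""
    # sort_keys gives a canonical serialization, so identical data
    # always produces the identical hash regardless of key order.
    payload = json.dumps(data, sort_keys=True)
    digest = hashlib.sha256(payload.encode("utf-8")).hexdigest()
    if STATE.get(resource_id) == digest:
        return False  # no change: emit nothing
    STATE[resource_id] = digest
    event = {"resource_id": resource_id, **data}
    # One JSON object per line (NDJSON) is what Logstash's file
    # input with a json codec expects.
    with open(out_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")
    return True
```

On the Logstash side this pairs with a `file` input using `codec => json`, so each appended line becomes one event.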


(Stiv Ostenberg) #3

Simple enough, I suppose! I would have to exclude "date last scanned" data. I will get onto that as soon as I get a little free time (trying to deploy an ELK stack on AWS Linux with Ansible; I had hoped for an easy playbook, but there doesn't appear to be one). Since the last post I have added a pile of new components, including ELB checks that also pull the certs and check certificate data on them. I was starting a DNS module, but I think the Logstash/JSON export will be the next focus.
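Excluding always-changing fields like a "last scanned" timestamp before hashing can be done with a small recursive filter (a sketch in Python; the field names in `VOLATILE_KEYS` are hypothetical and would need to match whatever the tool actually records):

```python
# Hypothetical names of fields that change on every scan and should
# not count as "real" changes for the hash comparison.
VOLATILE_KEYS = frozenset({"last_scanned", "scan_time"})

def strip_volatile(data, volatile=VOLATILE_KEYS):
    """Return a copy of the record with volatile fields removed at
    any nesting depth, so the change-detection hash ignores them."""
    if isinstance(data, dict):
        return {k: strip_volatile(v, volatile)
                for k, v in data.items() if k not in volatile}
    if isinstance(data, list):
        return [strip_volatile(v, volatile) for v in data]
    return data
```

The cleaned copy is what gets hashed; the full record, timestamps included, can still be the event you write out.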
