Currently, my Logstash application reads data from a Kafka stream.
I would like to add data from a different source which is updated once a week.
To do so, I would like to have an external (Ruby or Java?) plugin that loads data from the new source into memory on a daily basis. (I would like to avoid querying the database every time a Kafka message is received.)
Once the data is loaded into memory, I plan on adding some new fields to ES that are not obtainable from Kafka.
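To make the idea concrete, here is a minimal sketch of the caching part in plain Ruby (all names here are hypothetical, not a real Logstash API): an in-memory store that reloads the enrichment data at most once per refresh interval, so per-event lookups never hit the external source directly. A real Logstash filter plugin would wrap something like this in its `filter` method.

```ruby
# Hypothetical sketch: time-based in-memory enrichment cache.
class EnrichmentCache
  def initialize(refresh_interval_seconds, &loader)
    @interval  = refresh_interval_seconds
    @loader    = loader   # block that fetches fresh data, e.g. a weekly DB query
    @data      = {}
    @loaded_at = nil
    @mutex     = Mutex.new  # Logstash filters run in multiple worker threads
  end

  # Look up an enrichment record; reload the whole dataset only when stale.
  def lookup(key)
    @mutex.synchronize do
      if @loaded_at.nil? || (Time.now - @loaded_at) > @interval
        @data      = @loader.call
        @loaded_at = Time.now
      end
    end
    @data[key]
  end
end

# Usage: refresh at most once a day; lookups in between are memory-only.
cache = EnrichmentCache.new(86_400) { { "device-42" => { "region" => "eu" } } }
cache.lookup("device-42")  # => { "region" => "eu" }
```

The loader block is only invoked when the data is older than the interval, which is the behaviour described above: one load per day instead of one query per Kafka message.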
Is it possible to create an in-memory Logstash plugin that does what I describe above?
If not, is there a workaround?