Plugin with in-memory dictionary for Logstash

Currently, my Logstash application reads data from a Kafka stream.
I would like to add data from a different source which is updated once a week.
To do so, I would like to have an external (Ruby or Java?) plugin that can load data from the new source into memory on a daily basis. (I would like to avoid querying the database every time a Kafka message is received.)
Once the data is loaded into memory, I plan on adding some new fields to ES which are not obtainable from Kafka.

Is it possible to create a Logstash in-memory plugin which does what I describe above?
If not, is there a workaround?

Would the translate or jdbc_static plugins work for you?
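If the weekly data can be exported to a flat file, the translate filter might be all you need: it loads a local dictionary into memory and re-reads it on a schedule, so there is no per-event database query. A minimal sketch, assuming a YAML export; the field names, path, and interval are placeholders:

```
filter {
  translate {
    source           => "[product_id]"                   # hypothetical field to look up
    target           => "[product_info]"                 # enriched field written to the event
    dictionary_path  => "/etc/logstash/weekly_dict.yml"  # file your export job writes
    refresh_interval => 86400                            # re-check the file daily (seconds)
    fallback         => "unknown"                        # value when nothing matches
  }
}
```

(In older versions of the plugin the options are called `field` and `destination` instead of `source` and `target`.)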

Thanks! The jdbc_static plugin looks promising.
In my case, this is not a direct connection to a DB; rather, an API is wrapped around the DB which accepts SQL-like queries. The value the API returns is data represented as a Thrift object.

So, I would need to be able to 1) query the API and 2) decode the Thrift objects with the jdbc_static plugin. Do you think this is possible?

It sounds to me like that may require a custom plugin or a script to handle the conversion.
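One possible shape for that (all names and paths here are hypothetical): a scheduled job outside Logstash queries the API, deserializes the Thrift response, and rewrites a plain JSON file; a ruby filter then keeps the decoded dictionary in memory for lookups:

```
filter {
  ruby {
    # Assumes an external cron job queries the API, decodes the Thrift
    # objects, and rewrites this JSON file on the daily schedule.
    init => "
      require 'json'
      @dictionary = JSON.parse(File.read('/etc/logstash/decoded_dict.json'))
    "
    code => "
      key = event.get('product_id')   # hypothetical lookup field
      event.set('product_info', @dictionary[key]) if @dictionary.key?(key)
    "
  }
}
```

Note that `init` only runs at pipeline startup, so this won't pick up the daily rewrite by itself; having the job write YAML/CSV and pointing the translate filter above at it is probably simpler, since translate handles the periodic re-read for you.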

OK, thanks - I'll look into that.

@Christian_Dahlqvist Just to confirm: setting the Thrift conversion aside, would the REST API query part be supported by jdbc_static?

No, it expects to read from a database over JDBC.
