Logstash Cache

I'm working on an event filter written in Ruby that needs to do a lookup.

I have the data stored in a JSON file that I'm loading locally, and I'd like to know if there is any concept of a shared cache I can leverage, so I can avoid reading and parsing the whole file every time just to do a single lookup.

Ideally I would rather avoid adding a Redis or Memcached layer, though I suppose that is an option.

Any suggestions?

That sounds like you are re-implementing the translate filter. If translate is not a match for your use case, you can still harvest the dictionary loading and reloading code from it.
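For reference, a minimal translate filter along those lines might look like this. The file path and field names here are made up, and option names vary a little between plugin versions (older releases use `field`/`destination` instead of `source`/`target`), so check the docs for the version you run:

```
filter {
  translate {
    source          => "[user_id]"                  # event field to look up (assumed name)
    target          => "[user_name]"                # where the result is written
    dictionary_path => "/etc/logstash/lookup.json"  # hypothetical path to your JSON file
    fallback        => "unknown"                    # value when the key is missing
  }
}
```

The plugin loads the dictionary once at startup and can periodically reload it (see its `refresh_interval` option), which is exactly the shared-cache behavior you're after.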

Or else just load the file into an @instance variable in the init section of the ruby filter.
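A minimal sketch of that approach, assuming the file lives at /etc/logstash/lookup.json and the events carry a user_id field (both names are hypothetical):

```
filter {
  ruby {
    # init runs once when the pipeline starts
    init => "
      require 'json'
      @lookup = JSON.parse(File.read('/etc/logstash/lookup.json'))
    "
    # code runs once per event; only a hash lookup, no file I/O
    code => "
      key = event.get('user_id')
      event.set('user_name', @lookup[key]) if @lookup.key?(key)
    "
  }
}
```

Unlike translate, this variant never reloads the file; a pipeline restart is needed to pick up changes to the JSON.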

I was actually thinking about that a second after posting it. I can just remove the code from the ruby filter and let translate take care of it.

Sorry about that, but thank you for the help.

I think translate works a bit differently, since it's an actual plugin rather than a ruby filter executing arbitrary code. Still a good reference to look at.

If I load the data into an @instance variable, will it not be re-loaded for each event? Or is that a one-time operation?

The code in the init option of a ruby filter is one-and-done when the pipeline starts. The code in the code option runs per event.
