Loading a JSON file to enrich data

Hello everyone,

I am working with Logstash UDP Input Plugin and Elasticsearch output plugin.

I need to load a JSON file containing information that I will use to enrich the data sent to Elasticsearch, matching events against the same criteria.

What's the way to read a JSON file and load its data into memory?

I already use jdbc_static to enrich some data, and I also need to enrich with data coming from a JSON file.

If the source/input is only UDP, you can use the json codec, or the json_lines codec for multiline data. If the file is on disk, you should use the file input plugin.
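As an illustration, a minimal UDP input with the json codec might look like this (the port number is just a placeholder):

```
input {
  udp {
    port  => 5514    # hypothetical port, adjust to your setup
    codec => json    # parse each UDP datagram as a JSON document
  }
}
```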

If the JSON is inside a field, you have to parse it, and most likely you will need the json filter.
I'm not sure how you will receive it, e.g. as a single JSON document in the message field or as multiline.

Mmm, @Rios, I don't need to load the JSON data contained in the file (yes, it's just one file, and it's on disk) as an event itself; I need to use that data to enrich the events received over UDP, as we do with jdbc_static.

The file contains a list of key/value pairs that will be used to enrich the data received over UDP.

What's the way to read a JSON file at pipeline startup and keep its data in memory?

You might be able to use a translate filter to do a lookup.
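A sketch of that approach, assuming a hypothetical dictionary file at /etc/logstash/lookup.json and a hypothetical source field named host_ip:

```
filter {
  translate {
    # Hypothetical path; the file would hold a flat JSON object of
    # key/value pairs, e.g. { "10.0.0.5": "web-01", "10.0.0.6": "db-01" }
    dictionary_path  => "/etc/logstash/lookup.json"
    source           => "[host_ip]"    # field to look up ("field" in older Logstash versions)
    target           => "[host_name]"  # where the matched value goes ("destination" in older versions)
    refresh_interval => 300            # re-read the file every 5 minutes
    fallback         => "unknown"      # value written when no key matches
  }
}
```

The translate filter loads the dictionary into memory at startup and refreshes it periodically, which matches the in-memory lookup asked about above.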


I don't know how the message looks in the filter; if it's key/value pairs, you might use the kv filter, though I'm not sure.
Take into account also Budger's advice.
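For reference, a kv filter sketch, assuming the message field held space-separated key=value pairs:

```
filter {
  kv {
    source      => "message"  # field containing the raw key/value text
    field_split => " "        # pairs separated by spaces (assumption)
    value_split => "="        # keys and values separated by "="
  }
}
```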
