Logstash - in-memory maps/lookup tables?

Filebeat sends data to Logstash. I'm reading files that are up to about 500 MB, roughly 2 million lines each.

For each log line I calculate certain fields based on patterns in the log file's path: for example, I split the path to get the file name, and from the file name I extract the server name and the date (needed because log lines only contain part of the date).
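To make this concrete, what I'm after in Logstash would be roughly the dissect filter below. The `/var/log/app/<server>_<YYYY-MM-DD>.log` layout and the field names are just placeholders for my real pattern, and the path field depends on the Filebeat version (older versions send `source` instead of `[log][file][path]`):

```
filter {
  # Derive per-file fields from the path Filebeat attaches to each event.
  # Path layout here is only an example: /var/log/app/<server>_<YYYY-MM-DD>.log
  dissect {
    mapping => {
      "[log][file][path]" => "/var/log/app/%{server_name}_%{file_date}.log"
    }
  }
}
```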

Currently I have modified Filebeat to calculate this data once and tag it onto each event, but I'd rather just use the default Filebeat binary.

Is there a way to accomplish this in Logstash? I was thinking of having a temp folder where each file is named after the file-name field: if a matching file exists I reuse the cached values, and if not, I split the string and calculate them. I'm not sure if there is a better way to do this?
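Roughly what I have in mind, but with an in-memory hash instead of temp files, would be a ruby filter like the sketch below. The field names and the `<server>_<YYYY-MM-DD>.log` split are placeholders, and with multiple pipeline workers the cache would probably need to be a thread-safe map (e.g. Concurrent::Hash) rather than a plain Hash:

```
filter {
  ruby {
    # One shared cache per filter instance: compute the derived fields once
    # per unique file path, then reuse them for the other ~2 million lines.
    init => "@cache = {}"
    code => "
      path = event.get('[log][file][path]').to_s
      fields = @cache[path] ||= begin
        # Placeholder parsing: <server>_<YYYY-MM-DD>.log
        server, date = File.basename(path, '.log').split('_', 2)
        { 'server_name' => server, 'file_date' => date }
      end
      fields.each { |k, v| event.set(k, v) }
    "
  }
}
```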
