Adjust resources for a memory-heavy Logstash filter

Hi,

I'm writing a custom filter that reads from a file and generates a potentially huge number of events, one per line of the file: each line is translated into a yielded event.
With default settings it works well for rather small files, but chokes on 100 MB+ ones.
How do I increase the resources available to Logstash so it can handle this much data in the filter?
LS_HEAP_SIZE doesn't seem to be the relevant setting.
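For context, the filter is structured roughly like this (a simplified sketch; the plugin name, option, and target field are placeholders, not my real code):

```ruby
# Simplified sketch -- plugin name, option name, and field are made up.
require "logstash/filters/base"
require "logstash/namespace"

class LogStash::Filters::LineExpander < LogStash::Filters::Base
  config_name "line_expander"

  # Path of the file whose lines should become events (hypothetical option).
  config :path, :validate => :string, :required => true

  def register
    # Nothing to set up.
  end

  def filter(event)
    # Stream the file line by line; each line becomes a cloned event
    # handed back to the pipeline via yield.
    ::File.foreach(@path) do |line|
      cloned = event.clone
      cloned.set("message", line.chomp)
      filter_matched(cloned)
      yield cloned
    end
    # Drop the original triggering event.
    event.cancel
  end
end
```

Since `File.foreach` streams the file rather than slurping it, I'd expect the memory pressure to come from the yielded events rather than from reading the file itself.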

Using Logstash 5.3.1.

Thanks!
