Out of Memory at Beats Input

Hi,
From time to time we get this error from our Logstash instances:

[2025-03-19T16:13:36,640][INFO ][org.logstash.beats.BeatsHandler][beats-input] [local: 192.168.16.76:6044, remote: 192.168.15.127:52442] Handling exception:java.lang.OutOfMemoryError: Cannot reserve 67108864 bytes of direct buffer memory (allocated: 14026138464, limit: 14038335488) (caused by: java.lang.OutOfMemoryError: Cannot reserve 67108864 bytes of direct buffer memory (allocated: 14026138464, limit: 14038335488))

This error causes ingestion to stop, but Logstash itself keeps running. Is this a known issue? We are on version 8.16.3.

Thx
D

Maybe. Take a look at this issue (and the unmerged PR it links to).

If I understand it correctly (which I may not), this can happen when Logstash cannot keep up with the traffic that the Beats are sending.
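
One way to sanity-check that theory is to watch the pipeline statistics while the backlog builds up. A minimal sketch, assuming the default API binding on port 9600 (adjust if you changed api.http.host/api.http.port); the node stats API reports per-plugin event counts and durations, so you can see whether the filters/outputs are lagging behind the beats input:

```
# Per-pipeline / per-plugin event counts and timings from the Logstash monitoring API.
# 9600 is the default API port; change it if your api.http.port is different.
curl -s 'http://localhost:9600/_node/stats/pipelines?pretty'
```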

Instead of applying backpressure to slow the senders down, Netty just keeps growing its direct memory buffers until it hits the limit and throws the OOM you are seeing.
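
For context, the "limit" in that error is the JVM's direct memory cap. In Logstash that comes from config/jvm.options; if -XX:MaxDirectMemorySize is not set, the JVM defaults the cap to roughly the maximum heap size. A minimal sketch with purely illustrative values (not a sizing recommendation):

```
# config/jvm.options (illustrative values only)
-Xms8g
-Xmx8g
# Cap for direct (off-heap) buffers used by Netty; if this line is absent,
# the JVM defaults the cap to approximately the max heap size.
-XX:MaxDirectMemorySize=4g
```

Raising that cap only buys headroom, though: if Logstash genuinely cannot keep up, the buffers will eventually fill whatever limit you give them.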

Other issues that link to that one suggest that adjusting pipelining in the Beats client can help. And yes, I understand why that might not be a helpful suggestion.
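
If you do want to try it, the knobs live in the Beats output configuration. A hedged sketch for Filebeat, with setting names taken from the output.logstash options and purely illustrative values (check them against your Filebeat version's docs):

```
# filebeat.yml -- output section (illustrative values)
output.logstash:
  hosts: ["logstash.example.com:6044"]  # hypothetical hostname; 6044 is the port from the log above
  pipelining: 0        # disable async pipelining: wait for an ACK before sending the next batch
  bulk_max_size: 1024  # smaller batches per request
  worker: 1            # fewer concurrent connections per host
```

The trade-off is throughput: with pipelining disabled each connection sends one batch at a time, which slows the sender down instead of letting Netty buffer the backlog in direct memory.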