Exception in pipeline worker and Java Out of Memory Errors

I would move that up to the top of the filter {}. No point in doing work if you are going to drop the packet.
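For example, a minimal sketch of what I mean (the `[loglevel]` field and `"DEBUG"` value are just placeholders for whatever condition your pipeline actually drops on):

```
filter {
  # Hypothetical drop condition, moved to the top of filter {} so that
  # events which will be discarded skip all of the work below.
  if [loglevel] == "DEBUG" {
    drop { }
  }

  # The rest of your filters (grok, mutate, date, ...) go here and only
  # run for events that were not dropped.
}
```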

There is nothing I can see in the filter that would obviously cause a memory leak. You are going to have to load the whole 24 GB heap dump into MAT or a similar tool and see what the large objects are. I suspect the majority of the memory will have accumulated in one collection.

Once MAT has loaded and parsed the dump (which may take hours), if you have to spend more than 60 seconds figuring out where the memory is being used, then give up. If it is not super obvious, you are unlikely to find it.