Hello all.
I have a problem using the Logstash split filter.
Logstash pulls data from an API using the http_poller input plugin,
and the returned JSON data contains an array.
I split that array into events using the split filter plugin.
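For reference, my pipeline looks roughly like the sketch below. The URL, schedule, array field name ("items"), and index name are placeholders, not my real values:

```
input {
  http_poller {
    # placeholder API endpoint and polling schedule
    urls => {
      my_api => "https://example.com/api/data"
    }
    schedule => { every => "5m" }
    codec => "json"
  }
}

filter {
  split {
    # "items" stands in for the actual array field returned by the API
    field => "items"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "api-data"
  }
}
```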
There is no problem when the array contains a small amount of data (about 50~60 elements).
But there is a problem when the array contains a large number of elements (about 5000):
Logstash consumes too much memory while splitting the array,
and the JVM hangs once Logstash reaches the JVM's maximum memory.
I have increased the JVM heap up to 128GB, but Logstash still needs more memory.
It looks like Logstash keeps all of the split data in memory until the split filter finishes.
Is there any option to start sending events to the output (Elasticsearch) while the split filter is still working?