How to do batched fetching with jdbc_streaming?

I have to use the jdbc_streaming filter to get data from a Postgres database. With a large number of records, Logstash runs out of memory (OOM). Instead of increasing the heap size, I'd like to do batched reads the way the jdbc input does. I have been trying different approaches (using OFFSET and LIMIT in the query and writing the offset to a track_offset file), but I cannot get it working.
Does anyone have experience with this?
TIA
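
For reference, the paging behaviour I mean is what the jdbc input plugin does with jdbc_paging_enabled; the connection details and query below are just placeholders:

input {
  jdbc {
    jdbc_driver_library => "/path/to/postgresql-42.x.x.jar"            # placeholder path
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"  # placeholder DB
    jdbc_user => "user"
    jdbc_password => "secret"
    statement => "SELECT * FROM big_table ORDER BY id"
    jdbc_paging_enabled => true    # fetch the result set in pages
    jdbc_page_size => 100000       # rows per page
  }
}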

The jdbc_streaming filter returns the entire result set in a single event as an array of hashes. If that doesn't fit into the available memory then it will indeed run the JVM OOM. If you want one event per DB row then currently you have to use a split filter after the jdbc_streaming filter.
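
A minimal sketch of that arrangement (the connection settings, statement, and field names are placeholders, not taken from your setup):

filter {
  jdbc_streaming {
    jdbc_driver_library => "/path/to/postgresql-42.x.x.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "user"
    jdbc_password => "secret"
    statement => "SELECT * FROM orders WHERE customer_id = :id"
    parameters => { "id" => "customer_id" }
    target => "rows"               # the whole result set lands here as an array of hashes
  }
  split {
    field => "rows"                # one event per array element
  }
}

Note that the entire result set still has to fit in memory before the split runs, so this does not by itself avoid the OOM.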

It would be possible to re-architect the jdbc_streaming filter to have it use paging and do the splits itself but that would be quite a large project.

Thank you, Badger.
The jdbc_streaming query could return as many as 5 million rows. I have the following split script after the query.

# Receives the script_params configured on the ruby filter.
def register(params)
    @field = params['field']
    @target = params['target']
end

# Splits the array in @field into one cloned event per element,
# setting each element on @target. The returned array of events
# replaces the original event in the pipeline.
def filter(event)
    data = event.get(@field)
    event.remove(@field)
    a = []
    data.each { |x|
        e = event.clone
        e.set(@target, x)
        a << e
    }
    a
end
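
It is wired in with the ruby filter roughly like this (the script path and the field names are placeholders):

ruby {
  path => "/etc/logstash/scripts/split_rows.rb"   # placeholder path to the script above
  script_params => {
    "field" => "rows"    # array field produced by jdbc_streaming
    "target" => "row"    # field to hold each individual row
  }
}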