Hi, I have a simple Logstash pipeline that reads all docs from an ES index using the Elasticsearch input plugin, then a filter that splits one doc into several, and finally the Elasticsearch output plugin to index the resulting docs back into ES.
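For reference, here is roughly what the pipeline config looks like (the hosts, index names, and the nested field are placeholders, not my real values):

```
input {
  elasticsearch {
    hosts  => ["http://localhost:9200"]
    index  => "source-index"
    # size/scroll control how many docs are pulled per scroll page
    size   => 500
    scroll => "5m"
  }
}

filter {
  # split one event into several, one per element of the nested array
  split {
    field => "[nested_field]"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "target-index"
  }
  # debug output so I can see events as they are flushed
  stdout { codec => rubydebug }
}
```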
My problem is with the ES input plugin. It looks like it doesn't read docs in batches, but instead tries to fetch all documents into the in-memory queue. The reason I think this is the case is that my pipeline fails with an OOM exception, and the debug logs to stdout that I've added to the output section never print any events. But if I stop the pipeline before it crashes, I see the queue get emptied, and all events that were in the queue are printed and saved to the new index.
I am looking for a suggestion on how to extract nested docs into separate documents. Maybe I'm missing some setting in Logstash that would make it read docs from ES in batches, or maybe someone can suggest another approach.