Specifying the size of data to load into memory

I have an XML file and I want to parse it with Logstash. The issue is that when the XML file is large (1 GB), the process crashes because Logstash tries to load the whole file into memory. Is there a way in Logstash to load only a specified amount of data into memory, send it to Elasticsearch, free that data, and then load the next chunk of the same size?
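
For illustration, something along these lines is what I have in mind (only a sketch: the `<record>` element name, file path, and index name are placeholders, and it assumes each record starts with its opening tag on its own line). The idea is to let the multiline codec cut the file into one event per record so the xml filter never sees the whole document at once. Would this keep memory bounded to roughly one record at a time?

```
input {
  file {
    path => "/path/to/big-file.xml"      # placeholder path
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # Group lines until the next opening <record> tag, so each event
    # holds a single record instead of the entire document.
    codec => multiline {
      pattern => "<record>"
      negate => true
      what => "previous"
    }
  }
}

filter {
  # Parse each record individually; memory use stays bounded by record size.
  xml {
    source => "message"
    target => "parsed"
    store_xml => true
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "xml-records"               # placeholder index name
  }
}
```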

Thanks for your help.
