How does Logstash process data?

Hello Team

I need to push data from an Oracle table to an Elasticsearch index.

My Oracle table is 300 GB in size.

I would like to understand how Logstash processes the data when it pulls from Oracle { input } and writes it into an Elasticsearch index { output }.

Will it load the whole 300 GB of data into memory before producing output, or is there an internal buffer, so that it pushes intermediate chunks of data into the ES index?

Could someone please help me understand this behaviour?

I need this in order to proceed with the implementation.

Tushar Nemade

Logstash will use a cursor for the result set, so that it fetches results in groups. That is mentioned here. Events are then processed through the pipeline and sent to the output in batches.
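As a rough sketch of what such a pipeline might look like (the connection details, table name, and numeric values below are illustrative placeholders, not recommendations):

```
input {
  jdbc {
    jdbc_driver_library => "/path/to/ojdbc8.jar"                      # placeholder path
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@db-host:1521/ORCL"   # placeholder
    jdbc_user => "my_user"                                            # placeholder
    jdbc_password => "my_password"                                    # placeholder
    statement => "SELECT * FROM big_table"                            # placeholder query
    jdbc_fetch_size => 1000   # rows fetched per round trip via the result-set cursor
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "big_table"      # placeholder index name
  }
}
```

With a setup along these lines, `jdbc_fetch_size` controls how many rows the driver fetches from Oracle at a time, and the pipeline then moves events to the output in batches governed by `pipeline.batch.size` (125 by default) per worker. Memory use is therefore bounded by the fetch and batch sizes, not by the total table size.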

Thank You Badger.
