How to process large data with Logstash

I am having trouble getting Logstash to work with around a million files in a directory.

I am seeing a `max_open_files` limit warning when running it:

[2017-08-10T12:26:24,844][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<Java::JavaNet::URI:0x305f9956>]}
[2017-08-10T12:26:24,947][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>1000}
[2017-08-10T12:26:25,112][INFO ][logstash.pipeline        ] Pipeline main started
[2017-08-10T12:26:25,152][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-08-10T12:28:24,112][WARN ][logstash.inputs.file     ] Reached open files limit: 4096, set by the 'max_open_files' option or default, files yet to open: 1784323
[2017-08-10T12:28:46,289][WARN ][logstash.inputs.file     ] Reached open files limit: 4096, set by the 'max_open_files' option or default, files yet to open: 1784323
[2017-08-10T12:29:59,317][WARN ][logstash.inputs.file     ] Reached open files limit: 4096, set by the 'max_open_files' option or default, files yet to open: 1784323
[2017-08-10T12:30:20,848][WARN ][logstash.inputs.file     ] Reached open files limit: 4096, set by the 'max_open_files' option or default, files yet to open: 1784323
[2017-08-10T12:30:42,944][WARN ][logstash.inputs.file     ] Reached open files limit: 4096, set by the 'max_open_files' option or default, files yet to open: 1784323
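For context, the warning refers to the file input plugin's `max_open_files` option, which caps how many files the input watches at once. A sketch of a pipeline that raises that cap and closes idle files sooner via `close_older` might look like the following; the path, limit, and hosts are hypothetical and would need to match the real setup:

```
# Hypothetical pipeline config -- adjust path, limit, and hosts to your environment.
input {
  file {
    path => "/var/log/myapp/**/*.log"   # hypothetical glob over the million files
    max_open_files => 16384             # raise the open-file window (default is 4096)
    close_older => 3600                 # close files not read from within an hour,
                                        # freeing handles for the remaining files
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```

Note that `max_open_files` cannot exceed what the operating system allows the Logstash process: the OS file-descriptor limit (`ulimit -n`) would also need to be at least that high.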

Any suggestions would help; I am not sure how to proceed with this.
