input {
  file {
    path => [
      # forward slashes, even on Windows (the file input globs its paths)
      "${folder}/log1.txt",
      "${folder}/log2.txt",
      # ...
      "${folder}/log1000.txt"
    ]
  }
}
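(If the names really do follow this numeric pattern, I'm aware the file input accepts glob patterns, so a single entry like the sketch below, assuming every file sits directly under ${folder}, could replace the explicit list. My question applies either way.)

input {
  file {
    # one glob instead of 1000 explicit entries
    path => "${folder}/log*.txt"
  }
}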
Is there any problem with this kind of configuration, where Logstash is given a large number of input paths (1,000 files here)? Should I reduce pipeline.batch.size, or change something else, to lower the load? My assumption is that Logstash wouldn't try to open every file in the input and parse all of them at the same batch size simultaneously, since that would obviously overwhelm Logstash or Elasticsearch, but would instead read a limited number of files at a time.
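For reference, the only related knob I've found is the file input's max_open_files setting (documented default 4095, if I'm reading the docs right), which caps how many file handles the input holds at once. A sketch of how I'd expect to use it, with a placeholder value:

input {
  file {
    path => "${folder}/log*.txt"
    # cap concurrent file handles; 1024 is a placeholder, not a recommendation
    max_open_files => 1024
  }
}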
Context: Logstash keeps logging a "retryable error" (which many other people on the forum have reported, with no real answers), so I'm trying to find the root cause. It is massively slowing down our parsing speed and might force us off Elastic.
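My working theory, which may be wrong: the retryable error is the elasticsearch output backing off because Elasticsearch is rejecting bulk requests (HTTP 429), in which case shrinking the batches Logstash sends might help. A sketch of what I'd try in logstash.yml, with placeholder values:

# logstash.yml: smaller batches mean smaller bulk requests to Elasticsearch
pipeline.batch.size: 64   # default is 125 events per worker per batch
pipeline.workers: 2       # default is the number of CPU cores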