Many files in input path

input {
  file {
    path => [
      "${folder}\log1.txt",
      "${folder}\log2.txt",
      ...
      "${folder}\log1000.txt"
    ]
  }
}

Is there any problem with this kind of configuration, where Logstash is given 100+ file locations as input paths? Should I be reducing pipeline.batch.size or something else to reduce load? My assumption is that Logstash wouldn't try to open every single file in the input and parse all of them with the same batch size at the same time, since that would obviously overwhelm Logstash or Elasticsearch, but would instead parse a limited number at a time.
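For what it's worth, the file input plugin already caps concurrent file handles via its max_open_files setting (default 4095), and close_older controls when idle files are released, so a long path list does not mean every file is read simultaneously. Also, the plugin accepts glob patterns, so a list like the one above can often be collapsed. A hedged sketch (the folder variable and log*.txt naming are assumptions based on your example):

    input {
      file {
        # Glob instead of enumerating 1000 entries; assumes the files
        # share the log<N>.txt naming shown above.
        path => ["${folder}/log*.txt"]

        # Cap how many files are held open at once (plugin default is 4095).
        max_open_files => 256

        # Release handles for files with no new data after this interval.
        close_older => "1 hour"
      }
    }

Note that the file input documentation recommends forward slashes in path even on Windows, since the value is treated as a glob pattern.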

Context: Logstash keeps encountering a "retryable error" (which many other people on the forum have reported, with no proper answers), so I'm trying to find the root cause. This is massively slowing down the parsing speed and might force us off Elastic.

Exactly what error are you getting?

Attached is a screenshot of the error. It seems to be an error reported by many users, without any proper response to it.

At least 20-50% of the time, Logstash is just stuck waiting to retry.

FYI see Logstash constantly losing connection to Elastic for how I solved it.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.