I noticed something odd with my Elastic setup.
I'm running Elastic Stack 5.4.0 on a single machine for now.
I'm parsing log files from different servers, which I copy into the input folder every minute with batch scripts. At higher volume, files arrive at a rate of about 20 per second.
I've noticed mapper_parsing_exception errors in the logs, but when I inspect the log files that supposedly contain the problematic data, I can't find anything wrong with them.
Furthermore, if I use the same folder and the same logs but temporarily disable the input of new files, every log entry is parsed without a single error. And that's about 9000 files per day, with 1000 entries per file.
Why does Logstash have problems with folders that receive a constant stream of new files? My input path is defined as "/logs/smsc/bkki_*". I was considering dead letter queues, but that would require upgrading the stack.
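For reference, here is a minimal sketch of my file input. The `path` glob is my real one; the other settings are illustrative placeholders, not my exact config:

```conf
input {
  file {
    path => "/logs/smsc/bkki_*"
    # placeholder settings below, shown for context only
    start_position => "beginning"
    sincedb_path => "/var/lib/logstash/sincedb_smsc"
  }
}
```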
Do you have any suggestions on how to make this input more reliable?