"mapper_parsing_exception" caused by new files input

Hi,

I noticed something odd in my Elastic setup.
I'm running Elastic Stack 5.4.0 on one machine for now.

I'm parsing log files from different servers, which I copy to the input folder every minute with batch scripts. At higher volume, about 20 files per second come in.

I've noticed mapper_parsing_exception errors in the logs, but when I check the log files that carry the problematic data, I don't see anything wrong with them.

Furthermore, if I use the same folder and the same logs but temporarily disable the input of new files, all log entries are parsed without a single error. That's about 9,000 files per day with 1,000 entries per file.

Why does the Logstash engine have problems with folders that have a constant input of new files? The input path is defined like "/logs/smsc/bkki_*". I was thinking about adding dead letter queues, but that would require upgrading the stack.
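For reference, here is a minimal sketch of what that file input could look like; only the path pattern is from my setup, the other options are just assumptions for illustration, not my actual config:

```
input {
  file {
    # wildcard pattern mentioned above
    path => "/logs/smsc/bkki_*"
    # assumed: read each newly copied file from the beginning
    start_position => "beginning"
    # assumed location where the plugin tracks read offsets per file
    sincedb_path => "/var/lib/logstash/sincedb_smsc"
  }
}
```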

Do you have any suggestions on how to optimize the input?

Thanks!
M

UPDATE:
With further monitoring I noticed that the trouble begins with the removal of files rather than with new input. My log files have filenames that contain a date string, and I have a daily cron job that deletes files whose filenames contain "day -1".
Soon after the removal of old, already parsed files, I start receiving mapper_parsing_exception errors in the Logstash log.
