Random loss of data import between Filebeat and Logstash

Every night, shortly after midnight, we export data from a MySQL database into several CSV files.
Filebeat runs on the same server and is configured to ship this data to the Elastic server, which also runs Logstash.
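For reference, a minimal sketch of the Filebeat side of the setup described above might look like this (the paths, host name, and option values are hypothetical, not our actual config):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /data/export/*.csv      # hypothetical directory where the nightly CSV exports land
    scan_frequency: 10s          # how often Filebeat looks for new files

output.logstash:
  hosts: ["elastic-server:5044"]  # hypothetical Elastic server also running Logstash
```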

Logstash is configured to parse this data and feed Elasticsearch indices.
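The Logstash side is roughly the following shape (a sketch only; the column names, port, and index pattern are hypothetical):

```conf
input {
  beats {
    port => 5044
  }
}

filter {
  csv {
    separator => ","
    columns => ["col1", "col2"]   # hypothetical column names for the export
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "nightly-export-%{+YYYY.MM.dd}"   # hypothetical index pattern
  }
}
```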

It usually works fine and the data is available for use in Kibana in the morning.

BUT every now and then (once or twice a week) the imported data gets truncated: only part of the data arrives, for example only 4 lines out of 50. And on any given day, only some of the CSV files are affected.

In the morning, if we make a local copy of a partially imported file, Filebeat detects the copy and Logstash imports it successfully. We have to do this manually for every file that was incompletely imported.
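The manual workaround above amounts to something like the following (paths are hypothetical stand-ins for the real export directory; presumably the copy works because it gets a new name and inode, so Filebeat treats it as a fresh file):

```shell
# Hypothetical export directory; a temp dir stands in for it here.
EXPORT_DIR=$(mktemp -d)
printf 'col1,col2\n1,2\n' > "$EXPORT_DIR/export.csv"

# Copy the truncated file under a new name so Filebeat picks it up
# as a brand-new file and re-ships the full contents.
cp "$EXPORT_DIR/export.csv" "$EXPORT_DIR/export-reimport.csv"
```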

The error appears to occur completely at random.

What could be the source of this error?
How can we solve this issue?

Kind regards,

Does anybody have an idea about this?
