Hello, I am using the Logstash file input plugin to process many files and import them into ES. The config looks like this:
input {
  file {
    path => "/path/*.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
  }
}
output {
}
These files (the files under the /path directory) are copied to the destination automatically by another process. We noticed that Logstash sometimes skips the first few lines of new files, sometimes the first 20 rows, sometimes the first 50. We are sure there is no exception on the ES side (we checked the ES logs). To be 100% sure, we also took a tcpdump and saw that some of the entries are missing from the HTTP body sent to the ES bulk API. Our only observation is that the issue occurs under heavy load.
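One thing worth checking (an assumption on my part, not something confirmed from the logs above): if the copying process writes directly into the watched directory, the file input can discover a *.csv file while it is still being written and begin tracking it mid-copy, which would look exactly like skipped leading rows under heavy load. A common workaround is to stage the file under a name the glob does not match and then rename it into place, since rename on the same filesystem is atomic. A minimal sketch, using temporary directories in place of the real /path and source locations:

```shell
# Hypothetical sketch: publish a CSV into the watched directory atomically.
# mktemp dirs stand in for the real /path and the producer's staging area.
watch_dir=$(mktemp -d)             # stands in for /path watched by Logstash
src=$(mktemp)                      # stands in for the producer's source file
printf 'a,1\nb,2\n' > "$src"

# Copy under a hidden temp name that the "*.csv" glob does not match,
# so Logstash never sees the file while it is only partially written.
tmp="$watch_dir/.data.csv.tmp"
cp "$src" "$tmp"

# rename(2) is atomic on the same filesystem: the *.csv file appears
# complete, and the file input reads it from the first line.
mv "$tmp" "$watch_dir/data.csv"
```

If changing the copy process is not possible, another angle to investigate is the file input's sincedb: a file that is truncated or replaced in place can reuse an inode whose recorded offset already points past the beginning.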
Any idea about what could be causing this would be much appreciated.
Thanks