I'm using the file input plugin to load CSV files into Elasticsearch. The files are created on another machine and afterwards copied to the local filesystem, so I need to wait until a file has been completely copied before reading it. We are having issues reading files whose copy has not yet finished, probably because of how the NFS transfer works.
I want to create a pipeline that waits 90 seconds after a file is created before it starts reading it (90 seconds should be enough for the transfer to complete). This is the input part of my pipeline:
input {
  file {
    id => "import_files"
    path => "/logstash/files/csv/history_oracle_*"
    start_position => "beginning"
    sincedb_path => "/etc/logstash/sincedb_files"
    sincedb_clean_after => 4
    file_completed_action => "log"
    file_completed_log_path => "/etc/logstash/completed_files"
    stat_interval => "1000 ms"
    discover_interval => 90
  }
}
As far as I understand it, discover_interval * stat_interval should give 90 seconds (90 * 1000 ms), but I still get errors at almost the same time the file is created. How can I force Logstash to wait N seconds after it first sees a file before it actually starts reading from it?
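For context, the behaviour I'm trying to approximate would be something like the sketch below (just an illustration of the idea, not part of my current setup): a small watcher that moves a file into the directory Logstash watches only once its size has been stable for 90 seconds. The /logstash/incoming staging directory, the 5- and 10-second poll intervals, and the whole move-based workflow are assumptions I made for the example.

# Illustrative sketch only -- paths and intervals below are assumptions,
# not part of my actual pipeline.
import os
import shutil
import time

STAGING_DIR = "/logstash/incoming"    # hypothetical: where the NFS copy lands first
WATCHED_DIR = "/logstash/files/csv"   # directory Logstash actually watches
STABLE_SECONDS = 90                   # how long the size must stay unchanged

def wait_until_stable(path, stable_seconds):
    """Return once the file size has not changed for stable_seconds."""
    last_size = -1
    unchanged_since = time.time()
    while True:
        size = os.path.getsize(path)
        if size != last_size:
            last_size = size
            unchanged_since = time.time()
        elif time.time() - unchanged_since >= stable_seconds:
            return
        time.sleep(5)

while True:
    for name in os.listdir(STAGING_DIR):
        if not name.startswith("history_oracle_"):
            continue
        src = os.path.join(STAGING_DIR, name)
        wait_until_stable(src, STABLE_SECONDS)
        # Only hand the file to Logstash once the copy looks finished.
        shutil.move(src, os.path.join(WATCHED_DIR, name))
    time.sleep(10)

I'd rather solve this inside Logstash itself, though, if the file input has a setting for it.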