Hello, I've run into this type of issue:
2016-05-27T20:05:19Z INFO Old file with new name found: /flow_base_dir/dump/flows-201605270835.dump is no /flow_base_dir/dump/flows-201605272000.dump
2016-05-27T20:05:19Z INFO Detected rename of a previously harvested file: /flow_base_dir/dump/flows-201605270835.dump -> /flow_base_dir/dump/flows-201605272000.dump
2016-05-27T20:05:19Z INFO Registry file updated. 672 states written.
2016-05-27T20:05:29Z INFO Run prospector
2016-05-27T20:05:39Z INFO Run prospector
2016-05-27T20:05:49Z INFO Run prospector
2016-05-27T20:05:59Z INFO Run prospector
2016-05-27T20:06:06Z INFO Read line error: file inactive
These dump files are generated every five minutes, and each one is unique, so Filebeat picks them up and sends them to Kafka. I also have a cron job that runs every 4 hours and deletes the older files. I wonder if that contributes to the problem in any way.
Is there a way to tell Filebeat to drop older files, or, I guess, to clean up the registry?
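For what it's worth, one thing I've been experimenting with (assuming Filebeat 1.x, which the log format suggests) is the `ignore_older` and `close_older` prospector options, so Filebeat stops tracking files well before the cron job deletes them. This is just a sketch; the paths and time values are my own guesses, not anything from the docs for this exact setup:

```yaml
filebeat:
  prospectors:
    - paths:
        - /flow_base_dir/dump/flows-*.dump
      # Close the file handle once the file has been idle this long.
      # Since each dump is written once and never appended to, this
      # releases deleted files so the OS can reclaim them.
      close_older: 10m
      # Skip files whose modification time is older than this window;
      # should be longer than close_older.
      ignore_older: 1h
```

My understanding is that registry cleanup options (`clean_inactive` / `clean_removed`) only arrived in later Filebeat versions, so on 1.x the registry entries for deleted files may linger either way.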