The scenario is that I need to read files, make some changes, and write the content to another file (say output_file) via filebeat. I also have another application (say app2) that processes the output_file and then removes it.
I know it is strange but I need it.
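For reference, this is roughly the kind of configuration I am using; the paths and names below are only placeholders for illustration, not my real ones:

filebeat.inputs:
  - type: log
    paths:
      - /data/original_file.log    # the source file that filebeat reads

output.file:
  path: "/data/out"                # directory where filebeat writes the result
  filename: output_file            # app2 processes and then removes this file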
The problem is that when app2 removes the output_file while filebeat is still writing to it, filebeat stops writing any further output.
For example: the original_file is 100 MB. Filebeat reads the original_file and writes it to the output_file. When the output_file grows to 50 MB, app2 removes it, and the remaining 50 MB is lost.
I tried generating the output_file with the script below instead of filebeat, and that works well: all 10,000,000 records are processed.
for ((i=0; i<10000000; ++i)); do echo $i >> output_file.log; done
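My guess (and it is only a guess) is that the difference is that the script above reopens output_file.log for every single echo, while filebeat keeps one file handle open the whole time. A small sketch of what I mean, holding a single descriptor open like a long-running writer would:

# keep one descriptor open for the whole run instead of reopening per write
exec 3>> output_file.log
for ((i=0; i<10000000; ++i)); do
  echo $i >&3            # every write goes through the already-open descriptor
done
exec 3>&-                # close the descriptor
# if app2 removes output_file.log while this loop runs, the remaining writes
# go to the deleted inode and are lost, which looks like what happens with filebeat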
Please give me a hand with this. Thanks.
Best regards,
Ryan