I have a log file in the format below, and I'm using Filebeat to push it to Logstash:
Currenttime="5/21/19 1:42 AM" Job="MyJob" Status="WAITING" START_TIME="" END_TIME="" AVG_TIME=146
Currenttime="5/21/19 3:00 AM" Job="MyJob" Status="Running" START_TIME="2019/05/21 02:55" END_TIME="" AVG_TIME=146
Currenttime="5/21/19 4:00 AM" Job="MyJob" Status="Running" START_TIME="2019/05/21 02:55" END_TIME="" AVG_TIME=146
Currenttime="5/21/19 4:48 AM" Job="MyJob" Status="Completed" START_TIME="2019/05/21 02:55" END_TIME="2019/05/21 04:48" AVG_TIME=146
When I append multiple lines and save the file, Logstash doesn't process them line by line; it processes all of the lines in parallel. That means the last line is sometimes processed first, so when the events are indexed into Elasticsearch I don't always end up with the latest job details.
Is there any way to prevent this?
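I suspect the reordering comes from Logstash running multiple pipeline workers, which process batches of events concurrently. I haven't tried changing these yet, but this is a sketch of the settings I believe are relevant (names taken from the `logstash.yml` documentation; `pipeline.ordered` only exists in Logstash 7.7+):

```yaml
# logstash.yml -- force single-worker, ordered processing
# (trades throughput for ordering)
pipeline.workers: 1
pipeline.ordered: true   # 7.7+; "auto" also preserves order when workers == 1
```

The worker count can also be set on the command line with `-w 1`, but I'm not sure whether this alone guarantees ordering through the output stage.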