Process log lines one after the other

I have a log file in the format below, and I'm using Filebeat to push it to Logstash:

Currenttime="5/21/19 1:42 AM" Job="MyJob" Status="WAITING" START_TIME="" END_TIME="" AVG_TIME=146
Currenttime="5/21/19 3:00 AM" Job="MyJob" Status="Running" START_TIME="2019/05/21 02:55" END_TIME="" AVG_TIME=146
Currenttime="5/21/19 4:00 AM" Job="MyJob" Status="Running" START_TIME="2019/05/21 02:55" END_TIME="" AVG_TIME=146
Currenttime="5/21/19 4:48 AM" Job="MyJob" Status="Completed" START_TIME="2019/05/21 02:55" END_TIME="2019/05/21 04:48" AVG_TIME=146
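For reference, the lines are simple `key=value` pairs (quoted values plus a bare numeric `AVG_TIME`), and the goal is to always keep the most recent record per job. A minimal Python sketch of that parsing, just to illustrate the format (not part of the Filebeat/Logstash pipeline):

```python
import re

# key=value pairs: value is either quoted ("...") or a bare token (e.g. AVG_TIME=146)
PAIR_RE = re.compile(r'(\w+)=(?:"([^"]*)"|(\S+))')

def parse_line(line):
    """Turn one log line into a dict of field name -> value."""
    return {key: quoted or bare for key, quoted, bare in PAIR_RE.findall(line)}

def latest_per_job(lines):
    """Process lines in order; later lines overwrite earlier ones per Job."""
    latest = {}
    for line in lines:
        record = parse_line(line)
        latest[record["Job"]] = record
    return latest
```

This only produces the correct "latest" record because the lines are consumed in file order, which is exactly the property lost when Logstash processes events in parallel.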

When I append multiple lines to this file and save it, the lines are not processed one by one; they are all processed in parallel. This means the last line is sometimes processed first, so when the events are indexed into Elasticsearch I don't always end up with the latest job details.

Is there any way to prevent this?

@jsoriano,

Could you please assist me with this?

The issue was with my Logstash configuration. I restricted the pipeline to a single worker by adding `-w 1` when starting Logstash, and that solved the problem.
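For anyone else hitting this: the same restriction can be applied either on the command line or in `logstash.yml` (a sketch; adjust paths and pipeline names to your setup):

```shell
# Command-line flag: run the pipeline with a single worker so events
# are processed in the order they were read.
bin/logstash -f /path/to/pipeline.conf -w 1
```

The equivalent persistent setting in `logstash.yml`:

```yaml
# logstash.yml
pipeline.workers: 1
```

The trade-off is throughput: a single worker removes the parallelism that normally lets Logstash process batches of events concurrently, so this fix is best suited to pipelines where strict ordering matters more than speed.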
