Processing large log files in Logstash failing with _jsonparsefailure error

The setup is like this: we send files from a mainframe to the Logstash server through CFI (Tibco MFT), which is basically a file transfer. The log file has around 700k lines and is in JSON format. Here is what I think is happening: Logstash starts reading the file as soon as it appears on the server, but the transfer has not finished yet, so the file is still in transit due to network latency. Logstash starts processing it, and somewhere halfway through it fails with a "_jsonparsefailure" error. I am not sure it is really a JSON error; more likely a JSON record that has not been completely written yet is being picked up for processing by Logstash.
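For illustration only, this is the kind of failure a half-written record produces. The field names are made up, and the snippet just shows that a standalone JSON parser rejects a truncated line, which is roughly the situation the json codec would hit on a partially transferred file:

```python
import json

# Hypothetical complete log record (field names invented for illustration).
complete = '{"timestamp": "2023-01-01T00:00:00Z", "level": "INFO", "msg": "job started"}'
print(json.loads(complete))  # parses fine

# The same record cut off mid-transfer cannot be parsed.
truncated = complete[:40]
try:
    json.loads(truncated)
except json.JSONDecodeError as err:
    print("parse failure:", err)
```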
We are trying to figure out how to handle this. If anyone has encountered this type of issue or has a suggestion for how to handle it, that would be a great help. Thank you; we appreciate any response.

The solution would have to be external to Logstash. If possible, after transferring file X, also transfer a small (or even empty) file X.done, which tells the receiving side that the transfer of X is complete. A process on the receiver can then move X into a directory that Logstash actually watches; a sketch of that hand-off is below.
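A minimal sketch of that hand-off, assuming the MFT transfer lands in /data/incoming and Logstash's file input watches /data/logstash-in (both paths, and the 10-second poll interval, are placeholders, not part of the original setup):

```python
import shutil
import time
from pathlib import Path

INCOMING = Path("/data/incoming")      # where the MFT transfer lands (assumed path)
READY = Path("/data/logstash-in")      # directory Logstash's file input watches (assumed path)

def move_completed_files():
    """Move each file X into the watched directory only once its X.done marker exists."""
    READY.mkdir(parents=True, exist_ok=True)
    for marker in INCOMING.glob("*.done"):
        data_file = marker.with_suffix("")          # X.done -> X
        if data_file.exists():
            shutil.move(str(data_file), str(READY / data_file.name))
            marker.unlink()                         # remove the marker after the move

if __name__ == "__main__":
    while True:
        move_completed_files()
        time.sleep(10)
```

The key point is that Logstash never sees X until the sender has finished writing it and signalled completion with X.done, so it cannot pick up a half-transferred file.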

Thank you.
