_jsonparsefailure when ingesting a lot of files

I noticed that some of my files are not being ingested properly and the resulting events carry the "_jsonparsefailure" tag.

When I take a look at the message that was received, it indeed contains only part of the JSON file. I suspect that when Logstash sees a lot of new files at once, part of a file gets cut off and the JSON is then parsed incorrectly.

Anyone else run into this issue?
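For reference, to see what Logstash actually received for the failing files, the events tagged "_jsonparsefailure" can be routed to a separate output. This is only a minimal sketch for inspection, not part of my real pipeline:

```
output {
    # Send only the events that failed JSON parsing to the console
    # so the raw (possibly truncated) message can be inspected
    if "_jsonparsefailure" in [tags] {
        stdout { codec => rubydebug }
    }
}
```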

What does an example file look like? What does your configuration look like?

@magnusbaeck

> input {
>     file {
>         path => "/media/sf_folder/*json"
>         codec => json
>     }
> }
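For anyone trying to reproduce this, here is the same input with the path quoted and two purely testing-oriented settings added; the start_position and sincedb_path values below are illustrative assumptions, not something the shared-folder problem depends on:

```
input {
    file {
        path => "/media/sf_folder/*json"
        codec => json
        # Testing only: read each matched file from the beginning and
        # do not remember read positions between Logstash runs
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}
```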

I think the issue is that I am using a VM and ingesting the files through a shared folder. The issue was resolved when I ingested files that are local to the VM.
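In case it helps anyone else, the working setup simply points at a directory on the VM's own filesystem; the local path below is just an example, not the exact directory I used:

```
input {
    file {
        # Reading from the VM's local filesystem instead of the
        # VirtualBox shared folder avoided the truncated reads
        path => "/var/data/json/*json"
        codec => json
    }
}
```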
