Arrays in JSON Log File Too Long

Hi there, I'm currently using Filebeat to listen for JSON documents sent over TCP and forward them to Logstash, which then sends them on to Elasticsearch. The JSON files I'm attempting to index contain fields with some very long arrays (thousands of numbers each). When I send an abbreviated version of my JSON files, they upload fine and are indexed properly (by abbreviated, I mean I shortened the arrays from their full length down to only 2-5 arbitrary values). The full files, however, aren't being indexed, and no error is thrown. Is anyone familiar with this issue and aware of a workaround?
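
(For context, a minimal sketch of the kind of setup described above; the listen address, ports, and hosts are placeholders, not the poster's actual config:)

```yaml
# filebeat.yml -- TCP input receiving newline-delimited JSON events
filebeat.inputs:
  - type: tcp
    host: "0.0.0.0:9000"        # placeholder listen address
    processors:
      # Each received line arrives as a single "message" field;
      # decode the JSON payload into top-level fields.
      - decode_json_fields:
          fields: ["message"]
          target: ""

output.logstash:
  hosts: ["localhost:5044"]      # placeholder Logstash endpoint
```

```
# logstash.conf -- receive from Filebeat, forward to Elasticsearch
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # placeholder Elasticsearch endpoint
  }
}
```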

@hosey123 When you say you see no errors in the logs, have you checked both the Logstash log and the Filebeat log?

Yes, I have; there are no errors, the events just aren't being indexed. I've also realized that the issue doesn't seem to be the length of any particular array but the overall size of the file, so is there a cap on that?
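
(One plausible source of such a cap, assuming the events arrive over Filebeat's TCP input as sketched earlier: that input's max_message_size option defaults to 20MiB, and events over the limit may be dropped without a logged error, which would match the symptoms here. Raising it would look something like this; the listen address is a placeholder:)

```yaml
# filebeat.yml -- raising the per-message cap on the TCP input
filebeat.inputs:
  - type: tcp
    host: "0.0.0.0:9000"        # placeholder listen address
    max_message_size: 50MiB     # default is 20MiB
```

(Elasticsearch also caps HTTP request bodies via http.max_content_length, which defaults to 100mb in elasticsearch.yml, so that's worth checking if single events approach that size.)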

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.