I have a huge JSON file: a multiline, nested, indented JSON document about 28,000 lines long. I am trying to get this JSON into a single message field in Kibana, but it gets split into 57 different entries. Can you suggest a way to combine them, or an alternative that prevents the split in Logstash itself?
The easiest way to get this data into a single field is to index it as a string, rather than as an object. Is your Logstash configuration parsing or otherwise interpreting the JSON data?
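If you go the single-string route, one way to keep the whole file together is the multiline codec on the file input. This is only a sketch, not a drop-in config: the path and index name are placeholders, and it assumes you are reading the file with the file input. One detail worth noting: the multiline codec's `max_lines` setting defaults to 500, and 28,000 lines split at 500-line boundaries comes out to roughly 57 events, which matches the split you are seeing.

```
input {
  file {
    path => "/path/to/big.json"      # placeholder path
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      # A pattern no line will ever match, combined with negate => true and
      # what => "previous", appends every line to the event being built,
      # so the whole file becomes one event.
      pattern => "^__NEVER_MATCHES__$"
      negate => true
      what => "previous"
      max_lines => 30000             # default is 500, which splits large documents
      auto_flush_interval => 2       # emit the final event after 2s of inactivity
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "bigjson"               # placeholder index name
  }
}
```

With this, the entire document lands in the `message` field as one string. Also check the codec's `max_bytes` limit (10 MiB by default) if your file is large.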
Thanks for the reply.
It just parses the file generically; it does not interpret the JSON. By the way, is there any limit on the size of events that Logstash can process? If so, how can we change that limit?
Hey @Milind_Gawde, I've moved this to the Logstash forum, so that they can help you with your Logstash configuration. Thank you!
What does your Logstash configuration look like?
@magnusbaeck, I somehow managed to parse the data, but one particular entry does not get indexed at all. I tried changing its position in the file, yet it still does not get indexed. Any suggestions for solving this problem?
Tried changing its position, yet does not get indexed.
Check your Logstash logs for details. It's possible that Elasticsearch is rejecting one of the events; if so, Logstash will log a message that should contain enough detail to fix the problem.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.