One of the entries does not get indexed

(Milind Gawde) #1

I have a huge JSON file (about 28,000 lines) containing multi-line, nested, indented JSON. I am trying to get this JSON parsed into a single message field in Kibana, but the message gets split into 57 different entries. Can you suggest a way to combine them, or an alternative that prevents the split at the Logstash side itself?
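One common way to keep a multi-line document together on the Logstash side is the file input's `multiline` codec, which appends consecutive lines to a single event. A minimal sketch, assuming a hypothetical path `/path/to/data.json` and that each document begins with an unindented `{`:

```
input {
  file {
    path => "/path/to/data.json"   # hypothetical path
    start_position => "beginning"
    sincedb_path => "/dev/null"    # re-read the file on every run (testing only)
    codec => multiline {
      pattern => "^\{"             # a new document starts at an unindented "{"
      negate => true               # every line that does NOT match the pattern...
      what => "previous"           # ...is appended to the previous event
      auto_flush_interval => 2     # flush the final event after 2s of inactivity
    }
  }
}
```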

(Larry Gregory) #2

Hi @Milind_Gawde,

The easiest way to get this data into a single field is to index it as a string, rather than as an object. Is your Logstash configuration parsing or otherwise interpreting the JSON data?
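For example, mapping the field as `text` in the index template tells Elasticsearch to store the whole JSON blob as one searchable string rather than exploding it into nested object fields. A sketch of such a mapping (the index name `my-index` and field name `message` are placeholders), applied with a `PUT my-index` request:

```json
{
  "mappings": {
    "properties": {
      "message": { "type": "text" }
    }
  }
}
```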

(Milind Gawde) #3

Thanks for the reply.
It just parses the data generally; it does not interpret the JSON. By the way, is there any limit on the size of the events that Logstash can process? If so, how can we change that limit?
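One limit worth knowing about: if the `multiline` codec is involved, it has per-event caps, by default `max_lines` of 500 and `max_bytes` of 10 MiB, beyond which an event is flushed early. For a document of roughly 28,000 lines those would need raising. A sketch:

```
codec => multiline {
  pattern => "^\{"
  negate => true
  what => "previous"
  max_lines => 30000      # default is 500 lines per event
  max_bytes => "50 MiB"   # default is 10 MiB per event
}
```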

(Larry Gregory) #4

Hey @Milind_Gawde, I've moved this to the Logstash forum, so that they can help you with your Logstash configuration. Thank you!

(Magnus Bäck) #5

What does your Logstash configuration look like?

(Milind Gawde) #6

@magnusbaeck, I somehow managed to parse the data, but one particular entry does not get indexed at all. Tried changing its position, yet does not get indexed. Any suggestions for solving this problem?

(Magnus Bäck) #7

Tried changing its position, yet does not get indexed.

What position?

Check your Logstash logs for details. It's possible that Elasticsearch is rejecting one of the events; if so, Logstash will log a message that should contain enough detail to fix the problem.
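If the logs do show Elasticsearch rejecting an event (a mapping conflict is a common cause), Logstash's dead letter queue can capture the rejected events for inspection. With `dead_letter_queue.enable: true` set in `logstash.yml`, a second pipeline can read the queue back; a sketch, with a hypothetical queue path:

```
input {
  dead_letter_queue {
    path => "/var/lib/logstash/dead_letter_queue"  # hypothetical; defaults under path.data
    commit_offsets => true   # remember which dead-lettered events were already read
  }
}
output {
  stdout { codec => rubydebug }   # print each rejected event with its metadata
}
```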

(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.