JSON message not available in the ELK stack

Hi there,

We've run into a problem with our ELK stack. We have an application that logs in JSON format.
When there is an error, the application adds a field called "stack_trace" which contains the parsed stack trace (obviously).

We use the JSON filter plugin for Logstash. If the stack_trace field is present and contains a very large stack trace, the event is not sent to Elasticsearch, and we also cannot find it in the persistent queue.
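For context, our filter setup is roughly like the following minimal sketch (the source field name and the optional flag are assumptions for illustration, not our exact pipeline):

```
filter {
  json {
    # Parse the raw JSON line held in the "message" field
    source => "message"

    # Optional: keep events flowing instead of tagging
    # them with _jsonparsefailure on bad input
    # skip_on_invalid_json => true
  }
}
```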

Is there a limit on the size of a single field in a JSON message?
If the field contains a small stack trace, the message is sent to ES.

Or could it be that the whole message becomes too large when a big stack trace is included?

There are no errors in the logs of Filebeat, Logstash, or Elasticsearch regarding this problem.
We are running version 6.3.1 of the Elastic stack.

If you need more information, please ask :slight_smile:

The field is not sent, or the event is not sent? Are you saying the event is put into Elasticsearch without the over-sized stack trace?

Ah sorry, the complete event is not sent to Elasticsearch.

Does anyone know why this is happening?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.