Single log entry being divided in ELK stack


I have set up the pipeline properly: the logs are sent via TCP to Filebeat, which processes them and passes them further down the pipeline. The issue is that Elasticsearch and the Kibana dashboard treat each line as a separate log instead of a complete log object. The log entry is not treated as a single entity, which should be the normal behavior. I am attaching screenshots for better understanding.
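For context, a minimal sketch of the kind of setup described above, assuming Filebeat's TCP input is used to receive the logs (the port and Elasticsearch host here are placeholders, not from the original post):

```yaml
# filebeat.yml — hypothetical minimal configuration for this setup
filebeat.inputs:
  - type: tcp
    host: "0.0.0.0:9000"        # assumed port the application sends logs to

output.elasticsearch:
  hosts: ["localhost:9200"]     # assumed Elasticsearch address
```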

I am new to the ELK stack, so I would highly appreciate any help.

Bhavesh Vasnani

Look at the multiline matching functionality in Filebeat.
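As a sketch of what that looks like, here is a hedged example of Filebeat's multiline options, shown for a log input; the timestamp pattern is an assumption and would need to match the actual log format:

```yaml
# filebeat.yml — join continuation lines into one event (illustrative only)
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log               # assumed path
    multiline.pattern: '^\d{4}-\d{2}-\d{2}'  # assumed: events start with a date
    multiline.negate: true               # lines NOT matching the pattern...
    multiline.match: after               # ...are appended to the previous event
```

With `negate: true` and `match: after`, any line that does not begin with a timestamp is folded into the preceding event, so a stack trace ends up as a single log object.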

I have tried this, but it is not the solution to my problem. My logs are not multiple lines; rather, the resulting log entry is split into multiple lines and treated as a bunch of separate log lines instead of a single object, which is undesirable.

The reason it is showing like that is because that is how Elasticsearch is receiving them. So it must be how filebeat is processing them.

Is there a way I can see what Filebeat is receiving and what is passed to every level of the pipeline?

This should do what you need to see the filebeat output:

Enabling Debugging in Filebeat
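Concretely, one way to inspect what Filebeat is publishing is to raise its logging level; a minimal sketch (the selector choice is up to you):

```yaml
# filebeat.yml — print debug output, including each published event
logging.level: debug
logging.selectors: ["publish"]   # use ["*"] to enable all debug selectors
```

Alternatively, run Filebeat in the foreground with `./filebeat -e -d "publish"` to see the events it emits without editing the config file.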
