Filebeat reads log entries from a file and ships them to Kafka; Logstash then consumes from Kafka and writes to Elasticsearch.
But sometimes Logstash receives half-baked log entries (only a section of a log event), so the grok filter fails to parse them.
Other times the log text is garbled characters.
Why does this happen? Can someone help?
It's completely impossible to help without additional details. What do the broken log entries look like? What does your configuration look like?
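That said, if the truncated entries turn out to be multiline events (e.g. stack traces) arriving as separate fragments, that usually has to be fixed on the Filebeat side, before the events ever reach Kafka. A minimal sketch of a `filebeat.yml` input, assuming your events start with a timestamp like `2024-01-15 10:00:00` (the path and pattern here are placeholders, not taken from your setup):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log   # placeholder path
    # Treat any line that does NOT start with a timestamp as a
    # continuation of the previous event, so stack traces etc.
    # are shipped as one message instead of many fragments.
    multiline.pattern: '^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}'
    multiline.negate: true
    multiline.match: after
    # Garbled characters are often an encoding mismatch; set this
    # to whatever your application actually writes, e.g. GBK.
    encoding: utf-8
```

The garbled text is frequently a separate problem: the file is written in one encoding (e.g. GBK) but read as UTF-8, so checking Filebeat's `encoding` option against the application's actual output encoding is worth doing as well.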
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.