ERR Kafka (topic={topic}): dropping too large message of size

I'm using Filebeat (5.6.3) connecting to Kafka (0.10.2.1).

However, I got the following error in Filebeat:

> 2017-11-29T17:12:28+08:00 ERR Kafka (topic=message): dropping too large message of large size 3038538.

This means some messages are being dropped on their way into Kafka because they exceed the maximum message size.

I then tried to fix this in my Filebeat YAML file by adding a line under filebeat.prospectors:

    max_bytes: 10485760

https://www.elastic.co/guide/en/beats/filebeat/5.6/configuration-filebeat-options.html#max-message-size
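
For context, here is roughly how that setting sits in my filebeat.yml (a minimal sketch; the path is a placeholder, not my real one):

    filebeat.prospectors:
    - input_type: log
      paths:
        - /var/log/app/*.log    # placeholder path
      max_bytes: 10485760       # 10 MiB cap on a single raw log message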

But that didn't work; the error still occurs.

Which setting should I change to fix this error? Or should I amend the Kafka configuration instead?

Thank you.

  1. Kafka itself enforces a limit on message sizes. You will have to update the Kafka brokers to allow bigger messages.

  2. The Beats Kafka output checks the size of the JSON-encoded event. If an event exceeds the output's limit, it is dropped.

  3. The max_bytes setting caps the size of the raw log message. The encoded event can be much bigger, due to additional fields and string escaping. That is, max_bytes should be somewhat smaller than the maximum event size allowed by Kafka and by the Kafka output in Beats.

If you are fine with the default event limit in Kafka, try reducing max_bytes somewhat further.
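
Concretely, the three limits stack up roughly like this (a sketch with illustrative values only):

    # filebeat.yml -- cap on the raw log line read by the prospector
    filebeat.prospectors:
    - input_type: log
      max_bytes: 900000

    # filebeat.yml -- cap on the JSON-encoded event produced to Kafka;
    # keep it above max_bytes to leave room for added fields and escaping
    output.kafka:
      max_message_bytes: 1000000

    # Kafka broker server.properties -- broker-side cap;
    # max_message_bytes must not exceed this value
    message.max.bytes=1000000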

Thanks!
My problem was solved by two configuration changes:

  1. In the Filebeat yml file, under "output.kafka:":

     max_message_bytes:

  2. In the Kafka server properties file:

     message.max.bytes=

And my Logstash config downstream (the Kafka consumer) was also amended:

    input {
      kafka {
        max_partition_fetch_bytes => " "
      }
    }
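
Written out a bit more fully, that kafka input might look like the following (the broker address and fetch size are illustrative placeholders; the topic matches the error above):

    input {
      kafka {
        bootstrap_servers         => "localhost:9092"   # placeholder broker address
        topics                    => ["message"]        # topic seen in the Filebeat error
        max_partition_fetch_bytes => "10485760"         # must cover the largest message the broker delivers
      }
    }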

Hope this helps someone.

