ERR Kafka (topic=stack): dropping too large message of size 1262765

Hi,

I have been testing out Filebeat 5.3 and now 5.4, but I keep getting these errors and cannot find which config entry will solve them.
I have configured two log files, which produce about 16 lines per second combined, and I am not using multiline.

If I configure a single log file as input, I do not get these errors.
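
The relevant part of my filebeat.yml looks roughly like this (the paths, broker, and topic below are placeholders, not my literal values):

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/app/first.log    # placeholder path
  - input_type: log
    paths:
      - /var/log/app/second.log   # placeholder path

output.kafka:
  hosts: ["kafka1:9092"]          # placeholder broker
  topic: "stack"
```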

2017-05-09T15:04:23Z INFO Non-zero metrics in the last 30s: libbeat.kafka.call_count.PublishEvents=6 libbeat.kafka.published_and_acked_events=1489 libbeat.output.kafka.bytes_read=3266 libbeat.output.kafka.bytes_write=166811 libbeat.publisher.published_events=1489 publish.events=1490 registrar.states.update=1490 registrar.writes=6
2017-05-09T15:04:26Z ERR Kafka (topic=stack): dropping too large message of size 1006373.
2017-05-09T15:04:31Z ERR Kafka (topic=stack): dropping too large message of size 1006407.
2017-05-09T14:52:16Z ERR Kafka (topic=stack): dropping too large message of size 4220939

A message sent to Kafka is not just the original line from your log file, but the full JSON-encoded event being published. If you run Filebeat with -d '*', you will see the actual events being published.

Kafka enforces a limit on message sizes. Depending on the size of the event metadata, you want to configure the maximum message size in the Filebeat prospector so that, plus some delta for that metadata, it stays below Kafka's limit. Also consider adding processors to remove metadata you do not need from the events published to Kafka.
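
A minimal sketch of what that could look like in filebeat.yml, assuming Filebeat 5.x option names (max_bytes on the prospector, max_message_bytes on the Kafka output, and a drop_fields processor); the path, broker, and field list are examples, not a recommendation:

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/app/*.log        # example path
    # Cap a single event well below the Kafka limit, leaving
    # headroom for the JSON envelope and metadata.
    max_bytes: 900000

processors:
  # Drop metadata fields that are not needed downstream
  # (@timestamp and type cannot be dropped).
  - drop_fields:
      fields: ["input_type", "offset"]

output.kafka:
  hosts: ["kafka1:9092"]          # example broker
  topic: "stack"
  # Events larger than this are dropped by the output; the broker's
  # message.max.bytes must be at least this large as well.
  max_message_bytes: 1000000
```

The sizes in the errors above (mostly just over 1,000,000 bytes) line up with the Kafka output's default max_message_bytes of 1000000, so capping the prospector's max_bytes with some delta below that limit should stop the drops.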
