Message of length too large or too small

Hi, I'm making a log analysis system using Elastic.
It worked well in the test environment, but the following error occurred in the live environment.
The target log file is about 140 MB, yet the error message reports a length of more than 1 GB.

System flow:
apache log --> filebeat --> kafka

filebeat (yml)
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /logs/apache_access/access_log.20190517
  tail_files: true
  ignore_older: 1m
  close_inactive: 2m
  clean_inactive: 15m
  harvester_buffer_size: 16384
  apache: true

output.kafka:
  hosts: ["xxx.xxx.xxx.xxx:9092"]
  topic: "apache_log"
  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000

-- error message --
kafka/log.go:53 producer/broker/0 state change to [closing] because kafka: error decoding packet: message of length 1397966893 too large or too small

Please help.
Thanks in advance.

This is weird. Looking at your configuration, I see max_message_bytes, which should drop any event bigger than 1,000,000 bytes, and 1,397,966,893 is a lot bigger than that.
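
For background (my assumption, since the thread doesn't spell it out): Filebeat's Kafka output is built on the sarama Go client, and max_message_bytes should correspond to the producer-side limit there, so an oversized event would be dropped before it is ever sent. A minimal sketch of that check:

package main

import (
	"fmt"

	"github.com/Shopify/sarama"
)

func main() {
	// Producer-side limit; same value as max_message_bytes in the filebeat.yml above.
	cfg := sarama.NewConfig()
	cfg.Producer.MaxMessageBytes = 1000000

	// A deliberately oversized event (2 MB) to show what the limit would reject.
	msg := &sarama.ProducerMessage{
		Topic: "apache_log",
		Value: sarama.ByteEncoder(make([]byte, 2000000)),
	}
	fmt.Println(msg.Value.Length() > cfg.Producer.MaxMessageBytes) // true: too big to send
}

That said, this limit applies to what Filebeat sends, not to what it reads back from the broker.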

Does the watched log file have each log statement on a new line?
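
One more thing worth checking (a debugging trick, not something confirmed anywhere in this thread): the bogus length comes from sarama decoding the 4-byte length prefix of a packet read back from the broker, and when that number is absurd it often means the bytes aren't a length at all but the start of some other protocol's reply. You can decode the reported value as ASCII:

package main

import (
	"encoding/binary"
	"fmt"
)

func main() {
	// The "length" sarama complained about in the error message.
	const reported uint32 = 1397966893

	// Kafka length prefixes are big-endian 32-bit integers; recover the raw bytes.
	var b [4]byte
	binary.BigEndian.PutUint32(b[:], reported)

	// If the peer isn't speaking the Kafka protocol, these bytes are usually
	// the beginning of whatever it sent instead.
	fmt.Printf("0x%08X -> %q\n", reported, b[:]) // prints: 0x5353482D -> "SSH-"
}

1397966893 is 0x5353482D, i.e. the ASCII bytes "SSH-", which looks like the start of an SSH banner. So it may be worth double-checking that xxx.xxx.xxx.xxx:9092 really is the Kafka broker's plaintext listener and not, say, an SSH port or a mistyped address.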

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.