Hi, I'm building a log analysis system using Elastic.
It worked well in the test environment, but the following error occurred in the live environment.
The target log file is only about 140 MB, yet the error message reports a message length of more than 1 GB.
System flow:
apache log --> filebeat --> kafka
filebeat.yml:
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /logs/apache_access/access_log.20190517
  tail_files: true
  ignore_older: 1m
  close_inactive: 2m
  clean_inactive: 15m
  harvester_buffer_size: 16384
  apache: true

output.kafka:
  hosts: ["xxx.xxx.xxx.xxx:9092"]
  topic: "apache_log"
  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000
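To isolate whether the problem is Filebeat itself or the broker connection, here is a minimal standalone producer sketch using sarama (github.com/Shopify/sarama, the same Kafka client library Filebeat uses internally). The broker address and topic are the ones from the config above; this is just a connectivity test, not part of the actual setup:

package main

import (
	"log"

	"github.com/Shopify/sarama"
)

func main() {
	cfg := sarama.NewConfig()
	cfg.Producer.Return.Successes = true            // required for SyncProducer
	cfg.Producer.RequiredAcks = sarama.WaitForLocal // acks=1, as in filebeat.yml

	// Same broker address as in the filebeat config (masked here).
	producer, err := sarama.NewSyncProducer([]string{"xxx.xxx.xxx.xxx:9092"}, cfg)
	if err != nil {
		log.Fatalf("connect failed: %v", err) // fails here if the port is not a Kafka broker
	}
	defer producer.Close()

	partition, offset, err := producer.SendMessage(&sarama.ProducerMessage{
		Topic: "apache_log",
		Value: sarama.StringEncoder("connectivity test"),
	})
	if err != nil {
		log.Fatalf("send failed: %v", err)
	}
	log.Printf("ok: partition=%d offset=%d", partition, offset)
}

If this fails the same way, the problem is in the network path to the broker, not in the Filebeat config.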
-- error message --
kafka/log.go:53 producer/broker/0 state change to [closing] because kafka: error decoding packet: message of length 1397966893 too large or too small
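One observation: the Kafka wire protocol prefixes every response with a 4-byte big-endian length, and the reported length 1397966893 is 0x5353482D, which is the ASCII bytes "SSH-". So the client may be reading an SSH banner rather than a Kafka response, e.g. if the configured address/port reaches sshd in the live environment. A quick sketch to verify that decoding:

package main

import (
	"encoding/binary"
	"fmt"
)

func main() {
	// Reinterpret the "message length" from the error as the raw
	// 4 bytes the client actually read off the wire.
	b := make([]byte, 4)
	binary.BigEndian.PutUint32(b, 1397966893)
	fmt.Printf("% x -> %q\n", b, string(b)) // prints: 53 53 48 2d -> "SSH-"
}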
Please help.
Thanks in advance.