Hi - I am using Filebeat 7.9. Filebeat reads data from a log file and pushes it to a Kafka topic. Our Kafka cluster uses SASL_SSL with the Kerberos mechanism. When I set it up with the parameters below, it errors out.
Below is the error we get. Any help is appreciated. Thanks.
DEBUG [harvester] log/log.go:107 End of file reached: E:\Logs\file.log; Backoff now.
DEBUG [kafka] kafka/client.go:277 finished kafka batch
DEBUG [kafka] kafka/client.go:291 Kafka publish failed with: kafka: client has run out of available brokers to talk to (Is your cluster reachable?)
Kafka publish failed with: circuit breaker is open
I don't see any logs on the broker end. I also don't see a parameter for providing a JAAS conf file; I only supply the SSL and krb5 conf here. Am I missing something in the output.kafka parameters?
I am able to connect Logstash to Kafka (with Kerberos); there I provide the JAAS conf and krb5 conf along with the SSL truststore and keystore. With Filebeat I get the "client has run out of available brokers to talk to" and "circuit breaker is open" errors. Between Filebeat and Logstash, I assume Logstash would consume more memory, so we don't want to go with Logstash. Processing-wise we just read the logs and send them to Kafka; we don't do any aggregation or processing.
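For what it's worth, Filebeat is written in Go and does not use JAAS files at all; the Kerberos settings go directly into the output.kafka section of filebeat.yml, and the TLS settings take PEM files rather than JKS truststores/keystores. Below is a minimal sketch of the shape that configuration takes, assuming a keytab-based login. The hosts, principal, topic, and file paths are placeholders, and the kerberos.* options for the Kafka output are still beta in 7.9, so please check the exact keys against the docs for your version:

```yaml
output.kafka:
  hosts: ["broker1.example.com:9093"]     # SASL_SSL listener (placeholder host/port)
  topic: "my-topic"                       # placeholder topic name
  version: "2.0.0"                        # set to (or below) your broker version

  # Kerberos (GSSAPI) settings - no JAAS file involved, beta in 7.9
  kerberos.enabled: true
  kerberos.auth_type: keytab              # or "password"
  kerberos.username: "filebeat"           # principal name (placeholder)
  kerberos.keytab: "/etc/security/filebeat.keytab"
  kerberos.config_path: "/etc/krb5.conf"
  kerberos.service_name: "kafka"          # Kafka broker service principal name
  kerberos.realm: "EXAMPLE.COM"

  # TLS settings - PEM format, not a Java truststore
  ssl.enabled: true
  ssl.certificate_authorities: ["/etc/pki/kafka-ca.pem"]
```

If the handshake still fails with "client has run out of available brokers", running Filebeat with `-d kafka` and checking the broker's SASL/authorizer logs usually shows whether the failure is at the TLS layer or the GSSAPI step.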
Hi, I am using Filebeat 7.7.1 with Kafka 0.10.0.0 and am hitting the same problem.
It occurs when I use the Kerberos configuration.
Filebeat logs the following in an endless loop:
DEBUG [kafka] kafka/client.go:276 finished kafka batch
DEBUG [kafka] kafka/client.go:290 Kafka publish failed with: circuit breaker is open
INFO [publisher] pipeline/retry.go:196 retryer: send unwait-signal to consumer
INFO [publisher] pipeline/retry.go:198 done
INFO [publisher] pipeline/retry.go:173 retryer: send wait signal to consumer
INFO [publisher] pipeline/retry.go:175 done