Hi All,
I'm trying to send logs from Logstash to a Kerberized Kafka cluster. After starting the pipeline I get the messages below, which, I assume, tell me that the Kafka producer in Logstash is able to communicate and authenticate with the Kafka brokers:
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [INFO ] 2019-09-25 11:20:38.939 [[kafka]-pipeline-manager] AbstractLogin - Successfully logged in.
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [INFO ] 2019-09-25 11:20:38.949 [kafka-kerberos-refresh-thread-user@domain] KerberosLogin - [Principal=user@domain]: TGT refresh thread started.
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [INFO ] 2019-09-25 11:20:38.951 [kafka-kerberos-refresh-thread-user@domain] KerberosLogin - [Principal=user@domain]: TGT valid starting at: 2019-09-25T11:20:38.000+0000
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [INFO ] 2019-09-25 11:20:38.952 [kafka-kerberos-refresh-thread-user@domain] KerberosLogin - [Principal=user@domain]: TGT expires: 2019-09-25T21:20:38.000+0000
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [INFO ] 2019-09-25 11:20:38.953 [kafka-kerberos-refresh-thread-user@domain] KerberosLogin - [Principal=user@domain]: TGT refresh sleeping until: 2019-09-25T19:29:39.778+0000
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [INFO ] 2019-09-25 11:20:39.026 [[kafka]-pipeline-manager] AppInfoParser - Kafka version : 2.1.0
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [INFO ] 2019-09-25 11:20:39.026 [[kafka]-pipeline-manager] AppInfoParser - Kafka commitId : eec43959745f444f
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [WARN ] 2019-09-25 11:20:39.190 [[kafka]-pipeline-manager] LazyDelegatingGauge - A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [INFO ] 2019-09-25 11:20:39.195 [[kafka]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"kafka", "pipeline.workers"=>16, "pipeline.batch.size"=>4096, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>65536, :thread=>"#<Thread:0xb37115b run>"}
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [WARN ] 2019-09-25 11:20:39.197 [[kafka]-pipeline-manager] javapipeline - CAUTION: Recommended inflight events max exceeded! Logstash will run with up to 65536 events in memory in your current configuration. If your message sizes are large this may cause instability with the default heap size. Please consider setting a non-standard heap size, changing the batch size (currently 4096), or changing the number of pipeline workers (currently 16) {:pipeline_id=>"kafka", :thread=>"#<Thread:0xb37115b run>"}
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [INFO ] 2019-09-25 11:20:41.175 [[kafka]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"kafka"}
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [INFO ] 2019-09-25 11:20:41.383 [[kafka]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [INFO ] 2019-09-25 11:20:41.472 [LogStash::Runner] agent - Pipelines running {:count=>1, :running_pipelines=>[:kafka], :non_running_pipelines=>[]}
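For context, the kafka output section of my pipeline looks roughly like this. This is a sketch using the standard logstash-output-kafka options for SASL/GSSAPI; the topic name and file paths below are placeholders, only the broker address matches the one in the logs:

```
output {
  kafka {
    bootstrap_servers          => "10.143.165.80:6667"
    topic_id                   => "logs"                          # placeholder topic name
    security_protocol          => "SASL_PLAINTEXT"
    sasl_kerberos_service_name => "kafka"
    jaas_path                  => "/etc/logstash/kafka_jaas.conf" # placeholder path
    kerberos_config            => "/etc/krb5.conf"
  }
}
```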
However, after some time I start getting errors saying AUTHENTICATION_FAILED:
logstash_kafka_knode01.1.lvkrf838mr7n@SIDCSWARM03 | [ERROR] 2019-09-25 11:29:48.242 [kafka-producer-network-thread | producer-1] NetworkClient - [Producer clientId=producer-1] Connection to node -1 (10.143.165.80/10.143.165.80:6667) failed authentication due to: An error: (java.security.PrivilegedActionException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Fail to create credential. (63) - No service creds)]) occurred when evaluating SASL token received from the Kafka Broker. Kafka Client will go to AUTHENTICATION_FAILED state.
What exactly does this error indicate? Does it mean that Logstash is unable to send its authentication information to Kafka?
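For completeness, the JAAS file the pipeline loads looks roughly like the standard Krb5LoginModule client configuration below; the keytab path and principal are placeholders for my actual values:

```
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  useTicketCache=false
  keyTab="/etc/security/keytabs/user.keytab"   // placeholder keytab path
  principal="user@domain";                     // placeholder principal
};
```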