I am using the kafka input plugin and my logs are getting filled with the messages below:
[2020-01-09T17:28:16,750][WARN ][org.apache.kafka.clients.NetworkClient] [Consumer clientId=logstash-0, groupId=logstash] Error while fetching metadata with correlation id 1667 : {devglan-log-test1=UNKNOWN_TOPIC_OR_PARTITION}
[2020-01-09T17:28:16,854][WARN ][org.apache.kafka.clients.NetworkClient] [Consumer clientId=logstash-0, groupId=logstash] Error while fetching metadata with correlation id 1668 : {devglan-log-test1=UNKNOWN_TOPIC_OR_PARTITION}
I tried to set the log level of the kafka input plugin to ERROR as below:
logger.kafkainput.name = logstash.inputs.kafka
logger.kafkainput.level = error
but it didn't help. The API confirms the level was applied:
(base) C02VN29BHTD8:bin apple$ curl -XGET 'localhost:9600/_node/logging?pretty'
{
"host" : "C02VN29BHTD8",
"version" : "6.2.4",
"http_address" : "127.0.0.1:9600",
"id" : "90f1ffa9-bf26-43a9-8b97-1f6edefbc41e",
"name" : "C02VN29BHTD8",
"loggers" : {
"logstash.agent" : "INFO",
"logstash.api.service" : "INFO",
"logstash.codecs.plain" : "INFO",
"logstash.codecs.rubydebug" : "INFO",
"logstash.config.source.local.configpathloader" : "INFO",
"logstash.config.source.multilocal" : "INFO",
"logstash.config.sourceloader" : "INFO",
"logstash.inputs.kafka" : "ERROR",
"logstash.instrument.periodicpoller.deadletterqueue" : "INFO",
"logstash.instrument.periodicpoller.jvm" : "INFO",
"logstash.instrument.periodicpoller.os" : "INFO",
"logstash.instrument.periodicpoller.persistentqueue" : "INFO",
"logstash.modules.scaffold" : "INFO",
"logstash.outputs.stdout" : "INFO",
"logstash.pipeline" : "INFO",
"logstash.plugins.registry" : "INFO",
"logstash.runner" : "INFO",
"org.apache.kafka.clients.ClientUtils" : "INFO",
"org.apache.kafka.clients.CommonClientConfigs" : "INFO",
"org.apache.kafka.clients.Metadata" : "INFO",
"org.apache.kafka.clients.NetworkClient" : "INFO",
"org.apache.kafka.clients.consumer.ConsumerConfig" : "INFO",
"org.apache.kafka.clients.consumer.KafkaConsumer" : "INFO",
"org.apache.kafka.clients.consumer.internals.AbstractCoordinator" : "INFO",
"org.apache.kafka.clients.consumer.internals.AbstractCoordinator$HeartbeatThread" : "INFO",
"org.apache.kafka.clients.consumer.internals.AbstractPartitionAssignor" : "INFO",
"org.apache.kafka.clients.consumer.internals.ConsumerCoordinator" : "INFO",
"org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient" : "INFO",
"org.apache.kafka.clients.consumer.internals.Fetcher" : "INFO",
"org.apache.kafka.common.metrics.JmxReporter" : "INFO",
"org.apache.kafka.common.metrics.Metrics" : "INFO",
"org.apache.kafka.common.network.NetworkReceive" : "INFO",
"org.apache.kafka.common.network.PlaintextChannelBuilder" : "INFO",
"org.apache.kafka.common.network.Selector" : "INFO",
"org.apache.kafka.common.protocol.Errors" : "INFO",
"org.apache.kafka.common.requests.DeleteAclsResponse" : "INFO",
"org.apache.kafka.common.utils.AppInfoParser" : "INFO",
"org.apache.kafka.common.utils.Utils" : "INFO",
"org.logstash.Logstash" : "INFO",
"org.logstash.common.DeadLetterQueueFactory" : "INFO",
"org.logstash.common.io.DeadLetterQueueWriter" : "INFO",
"org.logstash.config.ir.CompiledPipeline" : "INFO",
"org.logstash.instrument.metrics.gauge.LazyDelegatingGauge" : "INFO",
"org.logstash.secret.store.SecretStoreFactory" : "INFO",
"slowlog.logstash.codecs.plain" : "TRACE",
"slowlog.logstash.codecs.rubydebug" : "TRACE",
"slowlog.logstash.inputs.kafka" : "TRACE",
"slowlog.logstash.outputs.stdout" : "TRACE"
}
}
Unless I set the level of "org.apache.kafka.clients.NetworkClient" itself to "ERROR", the warnings keep coming, because they are logged by the Java Kafka client, not by the Logstash plugin logger.
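For reference, I changed that logger at runtime through the Logstash logging API (this is the documented `_node/logging` endpoint; the logger key is the one shown in the listing above, and the change does not survive a restart):

```shell
# Temporarily raise the Kafka NetworkClient logger to ERROR via the
# Logstash logging API. Assumes Logstash's API is on the default
# port 9600 on localhost; the setting resets when Logstash restarts.
curl -XPUT 'localhost:9600/_node/logging?pretty' \
  -H 'Content-Type: application/json' \
  -d '{"logger.org.apache.kafka.clients.NetworkClient" : "ERROR"}'
```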
My question is: how do I change the log level of org.apache.kafka.clients.NetworkClient permanently in log4j2.properties?
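Based on the logger naming pattern I used for the plugin above, I would expect the equivalent log4j2.properties entry to look like the following (unverified; `kafkanetworkclient` is just an arbitrary identifier I picked, not a fixed name):

```properties
# Hypothetical log4j2.properties entry: raise the Kafka client's
# NetworkClient logger to ERROR so its WARN messages are suppressed.
# The "kafkanetworkclient" segment is an arbitrary logger id.
logger.kafkanetworkclient.name = org.apache.kafka.clients.NetworkClient
logger.kafkanetworkclient.level = error
```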