Is it possible to do load balancing using the Kafka input plugin?

This is the configuration I am using.

input {
  kafka {
    # Insert any one string from kafka_brokers_sasl in the Event Streams service credentials.
    bootstrap_servers => "<kafka_brokers_sasl>"
    # Insert the name of the topic you created in Event Streams.
    topics => ["<topic_name>"]
    security_protocol => "SASL_SSL"
    sasl_mechanism => "PLAIN"
    # Insert the username and password from the Event Streams service credentials.
    sasl_jaas_config => "org.apache.kafka.common.security.plain.PlainLoginModule required username='<user>' password='<password>';"
    type => "icd_postgresql"
    consumer_threads => 1
  }
}

I want to use this config on two machines and have my data go to both, but it is only captured on one machine.

Can you provide more context? Does only one machine read from Kafka? How many partitions does your Kafka topic have?

If you use this configuration in multiple Logstash instances, they will consume data from your Kafka topic in parallel. I would also recommend explicitly setting the group_id to some name instead of relying on the default.
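For example, a minimal sketch of the kafka input with an explicit group_id (the group name logstash_nodes is just an assumption; use any name, as long as both machines share it):

```
input {
  kafka {
    bootstrap_servers => "<kafka_brokers_sasl>"
    topics => ["<topic_name>"]
    # Same group_id on both machines: Kafka splits the topic's partitions between them.
    group_id => "logstash_nodes"
  }
}
```

Note that consumers in the same group share the partitions; if you instead gave each machine a *different* group_id, both machines would receive a full copy of every message.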

I configured 1 partition for my Kafka topic. I am using two different machines, each with a different broker SSL endpoint, but my data is captured on only one machine.

GOAL: I want my data to go to both machines.

You need at least 2 partitions. With 1 partition, only one consumer in the group will be able to consume from your topic, which is exactly what you are describing.
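Assuming you have the standard Kafka CLI tools and network access to the cluster (the topic name and broker address are placeholders, and with SASL_SSL you would also need to pass a client properties file via --command-config), the partition count can be increased with something like:

```
# Raise the topic to 2 partitions so each Logstash node can own one.
bin/kafka-topics.sh --bootstrap-server <kafka_brokers_sasl> \
  --command-config client.properties \
  --alter --topic <topic_name> --partitions 2
```

Partition counts can only be increased, never decreased, so pick a number at least equal to the number of Logstash nodes you plan to run.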

My Kafka cluster has 6 broker SSL endpoints. Do I have to use all of them at once on both machines, or would any one of them be sufficient for both?

Does the broker SSL endpoint somehow affect our incoming events?

It is not clear what you are asking.

Your Kafka cluster has 6 brokers? In the configuration you can list them all or just a couple of them; it makes no difference, as the Kafka client always fetches the full list of available brokers when it connects.
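For reference, bootstrap_servers accepts a comma-separated list, so listing two or three brokers gives the client a fallback if one is down at connection time (host names below are placeholders):

```
bootstrap_servers => "broker-1:9093,broker-2:9093,broker-3:9093"
```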

I don't think so.

The issue is that if your topic has just 1 partition, only 1 consumer per consumer group will be able to consume from it. Since you have 2 Logstash instances with the same group_id, one of them is idle; you need to increase the number of partitions to match the number of Logstash nodes you have.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.