input {
  kafka {
    # Insert any one string from kafka_brokers_sasl in the Event Streams service credentials.
    bootstrap_servers => "<kafka_brokers_sasl>"
    # Insert the name of the topic you created in Event Streams.
    topics => ["<topic_name>"]
    security_protocol => "SASL_SSL"
    sasl_mechanism => "PLAIN"
    # Insert the username and password from the Event Streams service credentials.
    sasl_jaas_config => "org.apache.kafka.common.security.plain.PlainLoginModule required username='<user>' password='<password>';"
    type => "icd_postgresql"
    consumer_threads => 1
  }
}
I want to use this config on two machines, and I want my data to go to both machines, but it is only captured on one machine.
Can you provide more context about this? Does just one machine read from Kafka? How many partitions does your Kafka topic have?
If you use this configuration in multiple Logstash instances, they will consume data from your Kafka topic in parallel. I would also recommend explicitly setting the group_id to some name instead of just relying on the default group_id name.
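For illustration, a minimal sketch of the same input with an explicit group_id; the name logstash_eventstreams is hypothetical, not from this thread. Instances that share a group_id split the topic's partitions between them, while instances with different group_ids each receive the full stream.

input {
  kafka {
    bootstrap_servers => "<kafka_brokers_sasl>"
    topics => ["<topic_name>"]
    security_protocol => "SASL_SSL"
    sasl_mechanism => "PLAIN"
    sasl_jaas_config => "org.apache.kafka.common.security.plain.PlainLoginModule required username='<user>' password='<password>';"
    # Hypothetical name; any stable string works. Instances sharing this
    # group_id divide the topic's partitions between them.
    group_id => "logstash_eventstreams"
    type => "icd_postgresql"
    consumer_threads => 1
  }
}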
I configured 1 partition for my Kafka topic. I am using two different machines, with a different broker URL from kafka_brokers_sasl on each machine, but my data is captured on only one machine.
Your Kafka cluster has 6 brokers? In the configuration you can put them all or just a couple of them; it makes no difference, as the Kafka client always gets a list of available brokers when connecting.
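For example, listing more than one broker only affects the initial bootstrap connection; the host names below are placeholders, not real Event Streams endpoints.

# Any subset of the kafka_brokers_sasl list works for bootstrapping;
# the client discovers the rest of the cluster after connecting.
bootstrap_servers => "broker-1.example.com:9093,broker-2.example.com:9093"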
I don't think so.
The issue is that if your topic has just 1 partition, only 1 consumer per consumer group can consume its logs. Since you have 2 Logstash instances with the same group_id, one of them is idle; you need to increase the number of partitions to match the number of Logstash nodes you have.
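As a sketch, the standard Kafka CLI can raise the partition count. This assumes you have kafka-topics.sh available and a client properties file carrying the same SASL_SSL credentials (the file name here is illustrative); Event Streams also lets you change partitions from its own UI/CLI.

# Raise <topic_name> to 2 partitions so both consumers in the group get an assignment.
# client.properties is assumed to hold the SASL_SSL settings shown in the Logstash config.
kafka-topics.sh --bootstrap-server <kafka_brokers_sasl> \
  --command-config client.properties \
  --alter --topic <topic_name> --partitions 2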