I am looking to configure 2 sets of kafka hosts to which different logs will be sent.
Example:
filebeat.inputs: syslog
This should be sent via the hosts below, without any authentication (no auth is configured on the server end):
output.kafka:
  topic: topic-syslog
  hosts:
    - syslog-kafka:2222
  topics:
    - topic: topic-syslog
      when.contains:
        fileset.name: "syslog"
filebeat.inputs: filestream for maillog
This should be sent via different Kafka hosts on which auth is configured. This option works if it is the only one configured; however, I can't get it to work together with syslog.
output.kafka:
  topic: topic-maillog
  sasl.mechanism: PLAIN
  username: "abc"
  password: "xx"
  hosts:
    - maillog-kafka:2222
  topics:
    - topic: topic-maillog
      when.contains:
        input.id: "maillog"
Can anyone please help with how I can set this up together, so that syslog goes via syslog-kafka and maillog goes via maillog-kafka?
In the filebeat.inputs section, we define two input sources (sketched below):
type: syslog for collecting syslog messages
type: filestream for collecting maillog files from a specific path
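A minimal sketch of those two inputs; the listening address and the maillog path are placeholders, so adjust them to your setup:

filebeat.inputs:
  - type: syslog
    protocol.udp:
      host: "0.0.0.0:9514"
  - type: filestream
    id: maillog
    paths:
      - /var/log/maillog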
In the output.kafka section, we specify the Kafka hosts to which the logs will be sent (see the sketch after this list):
syslog-kafka:2222 for syslog messages
maillog-kafka:2222 for maillog messages
We use the topic option with the value '%{[fields.topic]}' to dynamically set the topic based on the fields.topic field.
We configure the Kafka producer settings such as compression and maximum message size.
We enable SSL for secure communication with Kafka.
We set the SASL authentication mechanism to PLAIN and provide the username and password for authentication.
In the topics section, we define two topic configurations:
For topic-syslog, we use the when.contains condition to match the fileset.name field with "syslog". We set sasl.enabled: false to disable SASL authentication for this topic.
For topic-maillog, we use the when.contains condition to match the input.id field with "maillog". We set sasl.enabled: true to enable SASL authentication for this topic.
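Pulling those pieces together, a partial sketch of the output described above, restricted to documented kafka output settings (the per-topic sasl.enabled toggles described above are not documented Filebeat options, so they are omitted; hostnames and credentials are placeholders):

output.kafka:
  hosts:
    - syslog-kafka:2222
    - maillog-kafka:2222
  topic: '%{[fields.topic]}'
  compression: gzip
  max_message_bytes: 1000000
  ssl.enabled: true
  sasl.mechanism: PLAIN
  username: "abc"
  password: "xx"
  topics:
    - topic: topic-syslog
      when.contains:
        fileset.name: "syslog"
    - topic: topic-maillog
      when.contains:
        input.id: "maillog"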
With this configuration, Filebeat will send syslog messages to the topic-syslog topic on the syslog-kafka:2222 host without authentication, and maillog messages to the topic-maillog topic on the maillog-kafka:2222 host with SASL authentication. Make sure to adjust the paths, topic names, and authentication details according to your specific setup.
Did you test it? I don't think this works. In the hosts setting you put the hosts Filebeat will connect to when starting/restarting to get the cluster metadata; it will then use the metadata response to decide where to send the events.
If you have 2 different clusters in the hosts configuration, Filebeat will not connect to both of them, because they are expected to be part of the same cluster; it will talk to one of them and get the metadata for that cluster, and the other cluster will be completely ignored.
I don't think this is possible using just the kafka output.
But you can do that if you use the logstash output and configure your logstash pipeline to send the data to different kafka clusters according to your needs.
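For example, a rough sketch of that approach; the port, field names, and broker addresses are illustrative and need to match your environment.

Filebeat side:

output.logstash:
  hosts: ["logstash-host:5044"]

Logstash pipeline:

input {
  beats {
    port => 5044
  }
}

output {
  if [fileset][name] == "syslog" {
    # Cluster without authentication
    kafka {
      bootstrap_servers => "syslog-kafka:2222"
      topic_id => "topic-syslog"
    }
  } else if [input][id] == "maillog" {
    # Cluster with SASL/PLAIN authentication
    kafka {
      bootstrap_servers => "maillog-kafka:2222"
      topic_id => "topic-maillog"
      security_protocol => "SASL_PLAINTEXT"
      sasl_mechanism => "PLAIN"
      sasl_jaas_config => 'org.apache.kafka.common.security.plain.PlainLoginModule required username="abc" password="xx";'
    }
  }
}

If the authenticated cluster also uses TLS, security_protocol would be SASL_SSL instead of SASL_PLAINTEXT.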
@leandrojmp
I believe so, yes. We are using Logstash pipelines for both of them, and the Kafka clusters are configured. I have edited the actual host details and just gave an example.
In the actual configuration we have a cluster of servers. Can you please help me with the configuration required for your suggested option?
Thanks for the reply!
Unfortunately I don't have access to the server side, and I am quite new to Elastic. Can you please let me know what exactly you are after? I can ask the team about it.