Configure two sets of hosts for different logs

Hi Team,

I am looking to configure two sets of Kafka hosts to which different logs will be sent.

Example:

filebeat.inputs: syslog
This should be sent via the hosts below, without any authentication (no auth is configured on the server end):

output.kafka:
  topic: topic-syslog
  hosts:
    - syslog-kafka:2222
  topics:
    - topic: topic-syslog
      when.contains:
        fileset.name: "syslog"

filebeat.inputs: filestream for maillog
This should be sent via different Kafka hosts, on which auth is configured. This works if it is configured on its own; however, I can't get it to work together with syslog.

output.kafka:
  topic: topic-maillog
  sasl.mechanism: PLAIN
  username: "abc"
  password: "xx"
  hosts:
    - maillog-kafka:2222
  topics:
    - topic: topic-maillog
      when.contains:
        input.id: "maillog"

Can anyone please help me set this up so that syslog goes via syslog-kafka and maillog goes via maillog-kafka?

@swtg
Please edit the post and format the configuration properly.

filebeat.inputs:
- type: syslog
  enabled: true

- type: filestream
  enabled: true
  paths:
    - /path/to/maillog

output.kafka:
  hosts:
    - syslog-kafka:2222
    - maillog-kafka:2222

  topic: '%{[fields.topic]}'

  producer:
    compression: gzip
    max_message_bytes: 1000000

  ssl.enabled: true

  sasl.mechanism: PLAIN
  sasl.username: "abc"
  sasl.password: "xx"

  topics:
    - topic: topic-syslog
      when.contains:
        fileset.name: "syslog"
      sasl.enabled: false

    - topic: topic-maillog
      when.contains:
        input.id: "maillog"
      sasl.enabled: true

Explanation:

  1. In the filebeat.inputs section, we define two input sources:
  • type: syslog for collecting syslog messages
  • type: filestream for collecting maillog files from a specific path
  2. In the output.kafka section, we specify the Kafka hosts to which the logs will be sent:
  • syslog-kafka:2222 for syslog messages
  • maillog-kafka:2222 for maillog messages
  3. We use the topic option with the value '%{[fields.topic]}' to dynamically set the topic based on the fields.topic field.
  4. We configure the Kafka producer settings, such as compression and maximum message size.
  5. We enable SSL for secure communication with Kafka.
  6. We set the SASL authentication mechanism to PLAIN and provide the username and password for authentication.
  7. In the topics section, we define two topic configurations:
  • For topic-syslog, we use the when.contains condition to match the fileset.name field with "syslog", and set sasl.enabled: false to disable SASL authentication for this topic.
  • For topic-maillog, we use the when.contains condition to match the input.id field with "maillog", and set sasl.enabled: true to enable SASL authentication for this topic.

With this configuration, Filebeat will send syslog messages to the topic-syslog topic on the syslog-kafka:2222 host without authentication, and maillog messages to the topic-maillog topic on the maillog-kafka:2222 host with SASL authentication. Make sure to adjust the paths, topic names, and authentication details according to your specific setup.

Let me know if this worked.

Did you test it? I don't think this works. In the hosts setting you put the hosts to which Filebeat will connect when starting/restarting to get the cluster metadata; it will then use the metadata response to decide where to send the events.

If you have two different clusters in the hosts configuration, Filebeat will not connect to both of them, because they are expected to be part of the same cluster. It will talk to one of them and get the metadata for that cluster; the other cluster will be completely ignored.

I don't think this is possible using just the kafka output.

But you can do that if you use the logstash output and configure your Logstash pipeline to send the data to different Kafka clusters according to your needs.
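As a minimal sketch of that approach (assuming a Logstash instance reachable at logstash-host:5044; the hostnames, ports, topics, and credentials below are placeholders from the example), Filebeat would send everything to Logstash:

output.logstash:
  hosts:
    - logstash-host:5044

and the Logstash pipeline would route each event to the right Kafka cluster. The conditional below reuses the input.id field your original config already matches on; if that field is not present in your events, set a custom field or tag in the input and match on that instead. Whether the authenticated cluster also uses TLS (SASL_SSL vs. SASL_PLAINTEXT) is an assumption you will need to adjust:

input {
  beats {
    port => 5044
  }
}

output {
  if [input][id] == "maillog" {
    # maillog events go to the authenticated cluster
    kafka {
      bootstrap_servers => "maillog-kafka:2222"
      topic_id => "topic-maillog"
      security_protocol => "SASL_PLAINTEXT"
      sasl_mechanism => "PLAIN"
      sasl_jaas_config => 'org.apache.kafka.common.security.plain.PlainLoginModule required username="abc" password="xx";'
    }
  } else {
    # everything else (syslog) goes to the unauthenticated cluster
    kafka {
      bootstrap_servers => "syslog-kafka:2222"
      topic_id => "topic-syslog"
    }
  }
}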

@leandrojmp you are right. This would not work as I thought.

This solution needs either two instances of Filebeat or a Logstash instance that would separate the results, as sketched below.
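For reference, a minimal sketch of the two-instance option (assuming you can run a second Filebeat service; file names and paths are placeholders). One instance, e.g. filebeat-syslog.yml, handles syslog:

filebeat.inputs:
- type: syslog
  format: auto

output.kafka:
  hosts:
    - syslog-kafka:2222
  topic: topic-syslog

and a second instance, e.g. filebeat-maillog.yml, handles maillog with SASL:

filebeat.inputs:
- type: filestream
  id: maillog
  paths:
    - /var/log/maillog

output.kafka:
  hosts:
    - maillog-kafka:2222
  topic: topic-maillog
  sasl.mechanism: PLAIN
  username: "abc"
  password: "xx"

Note that each instance needs its own data path (for example via --path.data) so their registries do not collide.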

@leandrojmp
I believe so, yes. We are using Logstash pipelines for both of them, and the Kafka clusters are configured. I have edited out the actual host details and just gave an example.

In the actual configuration we have a cluster of servers. Can you please help me with the configuration that is required for your suggested option?

@Adriann Sorry, I am new to the forum and couldn't find the option to edit the post; however, below is the correct format:

filebeat.inputs:
- type: syslog
  format: auto


filebeat.inputs:
- type: filestream
  id: maillog
  enabled: true
  paths:
    - /var/log/maillog


output.kafka:
  topic: topic-syslog
  hosts:
    - syslog-kafka:2222
  topics:
    - topic: topic-syslog
      when.contains:
        fileset.name: "syslog"


output.kafka:
  topic: topic-maillog
  sasl.mechanism: PLAIN
  username: "abc"
  password: "xx"
  hosts:
    - maillog-kafka:2222
  topics:
    - topic: topic-maillog
      when.contains:
        input.id: "maillog"

You need to share your logstash configurations.

This does not work; you can have only one output block.

Thanks for the reply!
Unfortunately, I don't have access to the server side, and I am quite new to Elastic. Can you please let me know exactly what you are after? I can ask the team about it.