Configure 2 log paths in Filebeat, output to 2 Kafka topics, and display 2 indices in Kibana

Hi team Elastic,

I am configuring an ELK stack with the following flow:
Filebeat > Kafka > Logstash > Elasticsearch > Kibana
I need to monitor 2 log paths on the server and view them in Kibana.
I want to send those 2 paths to 2 separate Kafka topics and output them to 2 separate Elasticsearch indices.

filebeat.yml

filebeat.inputs:
- type: log
  paths:
    - /a/b/c_d_01*
  fields:
    log_type: c_d_01
  fields_under_root: true  
output.kafka:
    hosts: ["IP:9092"]
    topic: "app01"

filebeat.inputs:
- type: log
  paths:
    - /a/b/c_d_02*
  fields:
    log_type: c_d_02
  fields_under_root: true  
output.kafka:
    hosts: ["IP:9092"]
    topic: "app02"

logstash.conf

input {
  kafka {
    bootstrap_servers => 'IP:9092'
    topics => ["app01", "app02"]
    codec => json {}
  }
}

filter {
  # Add any necessary filters here
}

output {
  if [kafka][topic] == "app01" {
    elasticsearch {
      hosts => ["IP:9200"]
      sniffing => true
      index => "app01-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "abcd@123"
      ssl => true
      ssl_certificate_verification => false
      cacert => '/u01/logstash/certssl/elasticsearch-ca.pem'
    }
  } else if [kafka][topic] == "app02" {
    elasticsearch {
      hosts => ["IP:9200"]
      sniffing => true
      index => "app02-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "abcd@123"
      ssl => true
      ssl_certificate_verification => false
      cacert => '/u01/logstash/certssl/elasticsearch-ca.pem'
    }
  }
}

Please help me!

Currently I'm doing it this way, but with only 1 index. I want to split it into 2 indices to make checking logs in Kibana easier.

filebeat.yml

filebeat.inputs:
- type: log
  paths:
    - /a/b/c_d_01*
- type: log
  paths:
    - /a/b/c_d_07*

output.kafka:
  hosts: ["IP:9092"]
  topic: app01-app02

logstash.conf

input {
  kafka {
    bootstrap_servers => 'IP:9092'
    topics => ["app01-app02"]
    codec => json {}
    group_id => "app01-app02"
    auto_offset_reset => "earliest"
  }
}

output {
  elasticsearch {
    hosts => ["IP:9200"]
    sniffing => true
    index => "app01-app02-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "abcd@123"
    ssl => true
    ssl_certificate_verification => false
    cacert => '/u01/logstash/certssl/elasticsearch-ca.pem'
  }
}

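If the single Kafka topic is kept, the split into two indices can also happen on the Logstash side: Filebeat records the source file under `[log][file][path]` (on older Filebeat versions the field is `source` instead), so the output can route on that path. A sketch under that assumption, with the auth/SSL options elided for brevity (they would be the same as in the configs above):

```conf
output {
  # Filebeat adds the source file path under [log][file][path].
  if [log][file][path] =~ /c_d_01/ {
    elasticsearch {
      hosts => ["IP:9200"]
      index => "app01-%{+YYYY.MM.dd}"
      # ... auth/SSL options as in the configs above
    }
  } else {
    elasticsearch {
      hosts => ["IP:9200"]
      index => "app02-%{+YYYY.MM.dd}"
      # ... auth/SSL options as in the configs above
    }
  }
}
```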
Hi @carly.richmond, @tsullivan, @stephenb,
Please help me with this problem, or does anyone on your team know the answer?

Sorry for the rush, but I really need someone's help with this. Thank you, everyone.

Hi @Nghia_D_ng,

Thanks for your query. This forum is best-effort help and is not intended as support. It can take some time to get an answer, so please don't ping team members directly unless the question has been outstanding for a couple of days.

To confirm: you want logs from each Kafka topic to go to separate Elasticsearch indices? You should be able to do that using a conditional output similar to the one in your first post.

Was there a particular issue you had with that configuration?
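One thing worth checking in that first configuration: for the topic to be available in a Logstash conditional, the kafka input needs `decorate_events => true`, and the decorated fields then live under `[@metadata][kafka][topic]`, not `[kafka][topic]`. A sketch of the adjusted pipeline, reusing the hosts and topic names from the first post (the auth/SSL options are elided for brevity):

```conf
input {
  kafka {
    bootstrap_servers => 'IP:9092'
    topics => ["app01", "app02"]
    codec => json {}
    # Without this, the Kafka metadata (topic, partition, offset)
    # is not attached to the event at all.
    decorate_events => true
  }
}

output {
  # The decorated fields live under [@metadata], not at the top level.
  if [@metadata][kafka][topic] == "app01" {
    elasticsearch {
      hosts => ["IP:9200"]
      index => "app01-%{+YYYY.MM.dd}"
      # ... auth/SSL options as in the first post
    }
  } else if [@metadata][kafka][topic] == "app02" {
    elasticsearch {
      hosts => ["IP:9200"]
      index => "app02-%{+YYYY.MM.dd}"
      # ... auth/SSL options as in the first post
    }
  }
}
```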


Hi @carly.richmond,
When Filebeat is configured like the filebeat.yml I shared above, the service fails to start. I think there's an error somewhere.

OK, are there any errors in the logs? Can you share the error?

The first filebeat.yml you shared will not work: you can have only one output, but you configured two, and that is not supported.

You can use a conditional instead; the documentation has an example of how to use a conditional in the kafka output.

You can use the value you are adding in log_type to route each event to the right topic.
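A sketch of that approach, assuming the same paths, hosts, and `log_type` values as in the first post: both inputs stay, but there is only one `output.kafka`, and the `topics` rules pick the topic per event.

```yaml
filebeat.inputs:
- type: log
  paths:
    - /a/b/c_d_01*
  fields:
    log_type: c_d_01
  fields_under_root: true
- type: log
  paths:
    - /a/b/c_d_02*
  fields:
    log_type: c_d_02
  fields_under_root: true

# A single Kafka output; "topic" is the default, and the
# "topics" rules override it per event based on log_type.
output.kafka:
  hosts: ["IP:9092"]
  topic: "app01"
  topics:
    - topic: "app02"
      when.equals:
        log_type: "c_d_02"
```

Alternatively, if the field held the topic name itself, a format string such as `topic: '%{[log_type]}'` would work without any `topics` rules.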

Hi @leandrojmp ,
Thank you for your support
I succeeded!

Hi @carly.richmond ,
Thank you for your support
I succeeded!


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.