Creating Multiple Logstash indexes from Filebeat logs of different machines

I am fairly new to ES and I have been testing Filebeat/Logstash configurations. I am trying to get Logstash to output to multiple indexes based on the logs collected from another remote machine. For example, I want to send Elasticsearch logs (matched by the path `/var/log/elasticsearch/gc.*`) and regular logs (matched by the path `/var/log/*.log`) from, let's say, machine B to machine A, where Logstash, Elasticsearch, and Kibana are installed. Machine B already has Filebeat installed.

Here is the Filebeat configuration of Machine B:

    filebeat.inputs:

    # Each - is an input. Most options can be set at the input level, so
    # you can use different inputs for various configurations.
    # Below are the input specific configurations.

    - type: log

      # Change to true to enable this input configuration.
      enabled: true

      # Paths that should be crawled and fetched. Glob based paths.
      paths:
        - /var/log/*.log
    #    - /var/log/elasticsearch/gc.*
      fields: {log_type: logs8}

    - type: log
      enabled: true
      paths:
        - /var/log/elasticsearch/gc.*
      fields: {log_type: elasticsearch8}

    output.logstash:
      #enabled: true
      # The Logstash hosts
      hosts: ["<ip address of machine A>:5044"]

Here is the Logstash configuration used by Machine A:

    input {
      beats {
        port => 5044
      }
    }

    output {
      elasticsearch {
        hosts => "http://localhost:9200"
        sniffing => true
        manage_template => true
        index => "%{[fields][log_type]}-index"
      }
    }

Because I wanted to test whether Logstash could collect logs from different Filebeat instances and separate the logs based on their type, I also installed Filebeat on Machine A with the same configuration as Machine B, except that the Logstash output is:

    output.logstash:
      #enabled: true
      # The Logstash hosts
      hosts: ["localhost:5044"]

With both Filebeat configurations and the Logstash configuration running, Logstash created two new indexes named "logs8-index" and "elasticsearch8-index". The problem is that when I check these indexes in Kibana, they only seem to contain logs from Machine B, not from Machine A. In other words, each index only collected the regular logs and Elasticsearch logs of one machine, when I wanted the logs of both machines in each index.

Is there anything wrong with how I set up my configurations? Pardon my formatting; I am very new to starting a discussion thread.

By default a beats input listens on 0.0.0.0, but on my TCP stack that does not include localhost. Try using "<ip address of machine A>:5044" in machine A's Filebeat output as well.
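
In case it helps, here is a minimal sketch of that change, assuming Machine A's filebeat.yml otherwise stays as shown in the question; the only difference is that the loopback address is replaced with the machine's own routable address:

    # filebeat.yml on Machine A -- sketch of the suggested change.
    # "<ip address of machine A>" stands for the same routable address that
    # Machine B already uses to reach the beats listener on port 5044.
    output.logstash:
      hosts: ["<ip address of machine A>:5044"]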

If I understood correctly, you have:

  • Machine A, running

    • Logstash
    • Elasticsearch
    • Kibana
  • Machine B, running

    • Filebeat

Filebeat must be configured using the log input, and you can add custom fields:

    - type: log
      paths:
        - "/var/log/*.log"
      fields:
        log_type: logs8

The Filebeat output must be configured to target the IP or hostname of the machine running Logstash:

    output.logstash:
      enabled: true
      hosts: ["<ip or hostname of the machine running Logstash>:5044"]

See the documentation for more information.

The Logstash beats input listens on 0.0.0.0 by default, so it should not be a problem.
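
If you want to verify or pin that down, here is a sketch of the beats input with the listen address set explicitly (0.0.0.0 is already the plugin's default, so the extra line only makes the behaviour visible in the config):

    input {
      beats {
        # Bind to all interfaces; this is the default for the beats input,
        # so the line is only here for clarity.
        host => "0.0.0.0"
        port => 5044
      }
    }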


Some comments:

  • If you're interested in ingesting Elasticsearch logs, you might find the Filebeat Elasticsearch module useful
  • Filebeat is capable of routing the events to different indices too. See the following example:
    output.elasticsearch:
      hosts: ["http://<ip or hostname of the host running Elasticsearch>:9200"]
      index: "%{[fields.log_type]}-index" 
    
  • We usually recommend using ILM to better manage index rotation, or at least to ensure the index does not grow indefinitely; a minimal sketch follows this list.
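
As a rough sketch of the ILM point (the setting names below follow the Filebeat 7.x documentation, and the alias and policy names are placeholders to adapt, not a drop-in config), ILM can be enabled on the Filebeat side when it writes directly to Elasticsearch:

    # Sketch only -- check these settings against your Filebeat version.
    # When ILM is enabled, Filebeat writes to the rollover alias below
    # rather than to the custom index name shown in the previous bullet.
    setup.ilm.enabled: true
    setup.ilm.rollover_alias: "logs8"
    setup.ilm.policy_name: "logs8-policy"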
