Multiple Filebeats from multiple sources to multiple pipelines?

Let me see if I am doing this right. I have two logs from two different servers that I want to process in Logstash and then pass on to Elasticsearch. The files contain different information. I want to use the Filebeat shipper to send both files to a single ELK instance. The first shipper is working flawlessly, and the ELK instance is processing its log perfectly. I wrote the filebeat.yml on the second server so that Filebeat would send to the ELK instance over another port, 5048, and added another pipeline on the ELK server so that it would accept input on that port.

But it doesn't seem to be working. Is this a valid setup, or should I be doing something else?

Here are the files I am using:
On server 1: /etc/filebeat/filebeat.yml

#=========================== Filebeat inputs =============================

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/first_important_log.log
# Exclude lines, multiline, etc.
#============================= Filebeat modules ============================
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
  #reload.period: 10s
#==================== Elasticsearch template setting ==========================
setup.template.settings:
  index.number_of_shards: 1
#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["10.201.283.22:5044"]

Here is the same file from Server 2:

#=========================== Filebeat inputs =============================
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/second_important_log.log
# Exclude lines
# Include lines. A list of regular expressions to match. It exports the lines that are
### Multiline options
# (multiline options here)

#============================= Filebeat modules ===============================
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
  #reload.period: 10s
#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]
  # Protocol - either `http` (default) or `https`.
  #protocol: "https"
#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["10.201.283.22:5048"]
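As a sanity check before digging into the pipeline configs, here is a minimal sketch I ran from server 2 to confirm the ELK host is reachable on port 5048. This is a generic TCP check, not Filebeat-specific; the host and port are the ones from the config above.

```python
import socket

def port_open(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS/address errors.
        return False

# Run from server 2; host/port match the Logstash output above.
print(port_open("10.201.283.22", 5048))
```

If this prints False, the problem is network/firewall/listener rather than the pipeline config.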

Then, from the ELK instance (10.201.283.22)

/etc/logstash/pipelines.yml:

# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

- pipeline.id: "beats"
  path.config: "/etc/logstash/conf.d/10_beats.conf"
- pipeline.id: "second_beats"
  path.config: "/etc/logstash/conf.d/12_second_beats.conf"
- pipeline.id: "files"
  path.config: "/etc/logstash/conf.d/14_test_files.conf"

/etc/logstash/conf.d/10_beats.conf

input {
  beats {
    port => 5044
    type => "first_log"
  }
}

filter {
  if [type] == "first_log" {
    # ---code to process the first log---
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "beats_first_log-%{+YYYY.MM.dd}"
  }
}

/etc/logstash/conf.d/12_second_beats.conf

input {
  beats {
    port => 5048
    host => "0.0.0.0"
    id => "second_log_beat"
    type => "second_log"
  }
}

filter {
  if [type] == "second_log" {
    # ---code to handle the second log---
  }
}


output {
#  stdout { codec => rubydebug }
  elasticsearch {
    index => "second_log-%{+YYYY.MM.dd}"
    hosts => ["http://localhost:9200"]
  }
}
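For comparison, an alternative I have seen suggested is a single Beats port for both servers, routing inside one pipeline on tags set in each filebeat.yml (e.g. `tags: ["second_log"]` at the top level on server 2). This is just a sketch; the tag names are made up for illustration:

```
input {
  beats {
    port => 5044
  }
}

filter {
  if "second_log" in [tags] {
    # ---code to handle the second log---
  } else {
    # ---code to process the first log---
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
}
```

That would avoid opening a second port, but I would still like to know whether the two-port setup above should work.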

Thanks in advance.
