Assistance with multiple syslogs and creating multiple indexes

Hi Everyone,

I have an issue that many have probably encountered before. I have a server, "DATA-MGT", that collects data from other systems and formats that information as custom syslog files.

With Filebeat installed on DATA-MGT, we are sending the different syslogs to our Logstash instance, and I'm trying to figure out whether an index can be named after the document type.
I understand that Logstash filters can be made conditional with if [type] == "name_of_document_type".
I wasn't sure whether that could be applied to the output section as well; in my tests it has not worked.
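In other words, the end goal is something like the sketch below: a single output where the index name is built from the field itself (untested, and I realize Elasticsearch index names must be lowercase, so the type values would need to be lowercased first):

################################
output {
  elasticsearch {
    hosts => ["http://ELK-E:9200"]
    # sprintf field reference: the event's type value becomes part of the index name
    index => "%{[type]}-%{+YYYY.MM.dd}"
  }
}
################################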

Here are the config files I have. I'm new to YAML and to this forum, so please let me know if a blockquote is the best way to post sample config files.

Filebeat.yml:

################################

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\Temp\Logs\1*.log
  document_type: Log1
- type: log
  enabled: true
  paths:
    - C:\Temp\Logs\2*.log
  document_type: Log2
- type: log
  enabled: true
  paths:
    - C:\Temp\Logs\3*.log
  document_type: Log3
- type: log
  enabled: true
  paths:
    - C:\Temp\Logs\4*.log
  document_type: Log4

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

output.logstash:
  hosts: ["ELK-K:5044"]

processors:
- add_host_metadata: ~
- add_cloud_metadata: ~
################################

Logstash.conf

################################
input {
  beats {
    client_inactivity_timeout => 1200
    port => 5044
  }
}

filter {
  if [type] == "Log1" {
    csv {
      separator => ","
      columns => ["Item1","Item2","Item3","Item4","Item5","Item6"]
      convert => {
        "Item4" => "float"
        "Item5" => "float"
        "Item6" => "float"
      }
    }
    date {
      match => [ "Item1", "YYYY-MM-dd'T'HH:mm:ss.SSSZZZ" ]
      target => "timestamp"
    }
  }
  if [type] == "Log2" {
    csv {
      separator => ","
      columns => ["Item1","Item2","Item3"]
    }
    date {
      match => [ "Item1", "YYYY-MM-dd'T'HH:mm:ss.SSSZZZ" ]
      target => "timestamp"
    }
  }
  if [type] == "Log3" {
    csv {
      separator => ","
      columns => ["Item1","Item2","Item3","Item4"]
    }
    date {
      match => [ "Item1", "YYYY-MM-dd'T'HH:mm:ss.SSSZZZ" ]
      target => "timestamp"
    }
  }
  if [type] == "Log4" {
    csv {
      separator => ","
      columns => ["Item1","Item2","Item3","Item4"]
      convert => {
        "Item4" => "integer"
      }
    }
    date {
      match => [ "Item1", "YYYY-MM-dd'T'HH:mm:ss.SSSZZZ" ]
      target => "timestamp"
    }
  }
}

output {
  # Elasticsearch index names must be lowercase, so the index patterns below
  # are lowercased even though the type values themselves are "Log1" etc.
  if [type] == "Log1" {
    elasticsearch {
      hosts => ["http://ELK-E:9200"]
      index => "log1-%{+YYYY.MM.dd}"
    }
  }
  if [type] == "Log2" {
    elasticsearch {
      hosts => ["http://ELK-E:9200"]
      index => "log2-%{+YYYY.MM.dd}"
    }
  }
  if [type] == "Log3" {
    elasticsearch {
      hosts => ["http://ELK-E:9200"]
      index => "log3-%{+YYYY.MM.dd}"
    }
  }
  if [type] == "Log4" {
    elasticsearch {
      hosts => ["http://ELK-E:9200"]
      index => "log4-%{+YYYY.MM.dd}"
    }
  }
}
################################

Posting the config files that way is fine.

I'm not sure about the current version, but in the past document_type did set the type field, and your conditional logic for both the filters and the outputs looks OK. I suggest adding either a file or stdout output with { codec => rubydebug } and looking at what is actually on the events that arrive.
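For example, a temporary debug output like this sketch, added alongside your elasticsearch outputs:

################################
output {
  # Dump every event to the Logstash console so you can see whether
  # the type field is present and what value it carries.
  stdout { codec => rubydebug }
}
################################

One thing to watch: document_type was deprecated in Filebeat 5.x and removed in 6.0, so on a recent Filebeat that option will never set the type field. The usual replacement is a custom field, along these lines (a sketch, one input shown):

################################
filebeat.inputs:
- type: log
  paths:
    - C:\Temp\Logs\1*.log
  # On Filebeat 6+, this replaces document_type and puts type: Log1 at the event root
  fields:
    type: Log1
  fields_under_root: true
################################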
