One filter file for multiple filter plugins for different patterns

I am trying to process two log files, each containing data in a different pattern.
I have two individual filter configuration files, and each one works fine when parsing its respective file.
I need help consolidating them into a single Logstash filter configuration file that can parse both of these differently patterned log files.
The source is Filebeat.
The condition for choosing the filter plugin is the path of the file on each server, e.g. /var/log/type1/*.log.

I am thinking of using a configuration along these lines:

input {
  beats {
    port => 5044
  }
}

filter {
  if [path] == "custom/log/type1/*.csv" {
    csv {
      autodetect_column_names => true
      separator => ","
    }
    mutate { add_field => { "received_at" => "%{@timestamp}" } }
  }
  else if [path] == "custom/log/type2/*.log" {
    grok {
      match => { "message" => "%{TIME:timestamp}%{SPACE}%{WORD:Log-Level}%{SPACE}%{NOTSPACE:Java-class}:%{NUMBER:line-no}%{SPACE}-%{SPACE}%{GREEDYDATA:Log-Message}" }
    }
  }
}

output {
  elasticsearch {
    hosts => "...:9200"
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
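
One thing I am not sure about: as far as I know, == in a Logstash conditional is an exact string comparison and does not expand wildcards, and recent Filebeat versions ship the file path in [log][file][path] (older ones used [source]) rather than [path]. So maybe the conditionals need to be regex matches instead. A rough sketch of what I mean, with the field name and the regexes being my assumptions (not tested):

filter {
  # Assumption: Filebeat 7+ puts the source file path in [log][file][path];
  # older versions expose it as [source]. Adjust the field name to match the actual events.
  if [log][file][path] =~ /custom\/log\/type1\/.*\.csv$/ {
    csv {
      autodetect_column_names => true
      separator => ","
    }
    mutate { add_field => { "received_at" => "%{@timestamp}" } }
  }
  else if [log][file][path] =~ /custom\/log\/type2\/.*\.log$/ {
    grok {
      match => { "message" => "%{TIME:timestamp}%{SPACE}%{WORD:Log-Level}%{SPACE}%{NOTSPACE:Java-class}:%{NUMBER:line-no}%{SPACE}-%{SPACE}%{GREEDYDATA:Log-Message}" }
    }
  }
}

Is that the right way to route events from the two paths to the two filter blocks, or is there a cleaner approach?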
