Here is my filebeat config:
filebeat.prospectors:
- type: log
  paths:
    - C:/inetpub/app01/logs/IIS/*/*.log
  fields:
    app_group: iis
    app_id: app01
- type: log
  paths:
    - C:/inetpub/app02/logs/IIS/*/*.log
  fields:
    app_group: iis
    app_id: app02
- type: log
  paths:
    - C:/inetpub/app03/logs/IIS/*/*.log
  fields:
    app_group: iis
    app_id: app03
output:
  logstash:
    hosts: ["myserver:7777"]
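One variant worth noting (this is an alternative, not what I'm running): if a prospector sets `fields_under_root: true`, Filebeat puts the custom fields at the top level of the event instead of under `fields`, and the Logstash conditional would then need to be `[app_group]` rather than `[fields][app_group]`. A sketch of that variant for app01:

```yaml
- type: log
  paths:
    - C:/inetpub/app01/logs/IIS/*/*.log
  fields:
    app_group: iis
    app_id: app01
  fields_under_root: true   # app_group/app_id land at the event's top level
```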
And here is my Logstash config:
input {
  beats {
    port => 7777
  }
}
filter {
  if [fields][app_group] == "iis" {
    .... do my parsing ...
  }
}
output {
  if [fields][app_group] == "iis" {
    elasticsearch {
      hosts => "mycluster"
      index => "logstash-iis-%{[fields][app_id]}-%{+YYYY-MM-dd}"
    }
  } else {
    elasticsearch {
      hosts => "mycluster"
      index => "uncategorized"
    }
  }
}
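To see exactly what Logstash is receiving, one option is to temporarily add a stdout output with the rubydebug codec (a standard Logstash output/codec), which prints each event's full structure and shows whether `[fields][app_group]` is actually present on the events:

```
output {
  # Temporary debug output: prints every event in full so you can
  # check whether the nested fields.app_group key exists at all.
  stdout { codec => rubydebug }
}
```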
Everything is getting pushed into my "uncategorized" index, which means either the custom fields aren't being applied or my [fields][app_group] conditional is wrong. I can't find any example that sets multiple custom fields in Filebeat and then accesses them in Logstash.
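For reference, my understanding of the intended layout: with the default `fields_under_root: false`, Filebeat nests the custom fields under a top-level `fields` key, so the Logstash reference `[fields][app_group]` is a nested lookup. A small illustration of that nesting (the event dict here is hand-written to mimic the shape, not real Filebeat output):

```python
# Hypothetical event shape: custom fields from filebeat.yml sit under
# a top-level "fields" key when fields_under_root is left at its default.
event = {
    "@timestamp": "2017-01-01T00:00:00.000Z",
    "message": "GET /index.html 200",
    "fields": {
        "app_group": "iis",
        "app_id": "app01",
    },
}

# The Logstash reference [fields][app_group] corresponds to this lookup:
app_group = event["fields"]["app_group"]
print(app_group)  # iis

# And the sprintf in the index name, %{[fields][app_id]}, expands similarly:
index_name = "logstash-iis-%s" % event["fields"]["app_id"]
print(index_name)  # logstash-iis-app01
```

If the rubydebug output showed the events *without* this `fields` sub-object, that would explain why the conditional never matches and everything falls through to the else branch.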