Having trouble getting my NetFlow v9 data to load into Elasticsearch properly. I'm feeding logs from multiple Beats sources, and those are working fine. All of the other sources are creating indices like "firewall1-2017-08-01", "device2-2017-08-01", etc.
For some reason, though, my NetFlow data is creating indices with the literal name %{[@metadata][type]}-2017-08-01, so it looks like [@metadata][type] isn't being set on those events, and I don't know how to add it. I tried adding "metadata => true" to the udp input section, but Logstash wouldn't even start with it there.
Any ideas?
My config is pretty simple:
input {
  beats {
    port => 5044
  }
  udp {
    port => 9995
    # metadata => true   (Logstash refused to start with this option set)
    codec => netflow {
      versions => [9]
    }
    type => netflow
  }
}

output {
  elasticsearch {
    hosts => [":9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][type]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
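
I'm guessing I need to populate [@metadata][type] myself for the udp events, maybe with something like the mutate filter sketched below (just my guess: copying the plain type field into @metadata so the sprintf in the elasticsearch output can resolve it):

filter {
  # Guess: mirror the plain "type" field into @metadata so that
  # %{[@metadata][type]} in the output resolves for netflow events too.
  if [type] == "netflow" {
    mutate {
      add_field => { "[@metadata][type]" => "%{type}" }
    }
  }
}

Would that be the right approach, or is there a cleaner way to get the udp input to set @metadata the way the beats input does?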