Hello there!
Probably a dumb question, but:
I'm wondering how Winlogbeat parsing happens, since I'm not using any Logstash processing (grok, kv, ...) and didn't explicitly specify any ingest pipeline to use.
How does it work?
My setup : Winlogbeat -> Kafka -> Logstash -> Elasticsearch
My Winlogbeat Logstash config:
input {
  kafka {
    topics => ["winlogbeat"]
    codec => json
    bootstrap_servers => "kafka01:9092,kafka02:9092,kafka03:9092"
    consumer_threads => 2
    decorate_events => true
  }
}

filter {
  mutate {
    add_field => {
      "[@metadata][index]" => "winlogbeat"
    }
  }
}

output {
  elasticsearch {
    hosts => ["https://elastic1:9200","https://elastic2:9200"]
    manage_template => false
    ilm_enabled => true
    index => "%{[@metadata][index]}"
    user => "elastic"
    password => "changeme"
  }
}
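For reference, this is my understanding of what explicitly selecting an ingest pipeline would look like in the output block (the `pipeline` option of the elasticsearch output and the `%{[@metadata][pipeline]}` field are what the Winlogbeat/Logstash docs describe; I'm not doing any of this):

output {
  elasticsearch {
    hosts => ["https://elastic1:9200","https://elastic2:9200"]
    index => "%{[@metadata][index]}"
    # explicit routing to an Elasticsearch ingest pipeline -- absent from my config
    pipeline => "%{[@metadata][pipeline]}"
  }
}

Since I have nothing like this, I don't see where the parsing would be happening.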
Thanks!