Hi,
I have two input plugins in Logstash, Filebeat and Kafka topics, for two different types and sources of logs.
Now I want to filter and process the logs and send them to different Elasticsearch indexes. Since I am very new to this area, I have not been able to find a solution. My conf looks like below; can someone suggest how to do this?
Thanks, Bikash
input {
  beats {
    port => 5044
    type => "iislogs"
  }
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["ApplicationError"]
    type => "applicationerror"
  }
}
filter {
  if [type] == "iislogs" {
    grok {
      match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:serviceName}"]
    }
  }
  if [type] == "applicationerror" {
    grok {
      # ...for application error parsing
    }
  }
}
In the output, I want to send the events to different indexes based on the input type:
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{type}"
  }
  stdout { codec => rubydebug }
}