Logstash - setting the Elasticsearch index based on input type

Hi,

I have two input plugins in Logstash, i.e. Filebeat and a Kafka topic, for two different types and sources of logs.

I now want to filter and process the logs and send them to different Elasticsearch indexes. Since I am very new to this area, I have not been able to find a solution. My conf looks like the one below; can someone suggest what to do?

Thanks, Bikash

input {
  beats {
    port => 5044
    type => "iislogs"
  }

  kafka {
    bootstrap_servers => "localhost:9092"
    topics => "ApplicationError"
    type => "applicationerror"
  }
}

filter {

  if [type] == "iislogs" {
    grok {
      match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:serviceName}"]
    }
  }

  if [type] == "applicationerror" {
    grok {
      #...for application error parsing
    }
  }
}

Here, in the output, I want to send the events to different indexes based on the input type.

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => %{type}
  }
  stdout { codec => rubydebug }
}

I think you just need quotation marks around the type variable in your output, but I could be wrong. What error do you get?
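Something like this, where the index name is a quoted string so Logstash performs sprintf substitution of the event's type field (just a sketch based on the config you posted; note that Elasticsearch index names must be lowercase, which both of your type values already are):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # quoted so %{type} is replaced with the event's type field
    index => "%{type}"
  }
  stdout { codec => rubydebug }
}

If you would rather not derive the index name from an event field, you can also branch explicitly on the type in the output section:

output {
  if [type] == "iislogs" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "iislogs"
    }
  } else if [type] == "applicationerror" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "applicationerror"
    }
  }
  stdout { codec => rubydebug }
}

The conditional version is more typing but makes it harder for an unexpected type value to create a stray index.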

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.