Hi Guys,
I'm not able to create custom indices in Elasticsearch using Logstash and Filebeat.
Log flow: Filebeat ==> Logstash ==> Elasticsearch
I have two custom log files, /var/log/app1/app1.log and /var/log/app2/app2.log.
In Filebeat I want to ship these log files along with tags, so each log file has its own tag (app1 and app2).
Using these tags, I want to route the events in Logstash so that it creates two custom indices in Elasticsearch (app1_logs and app2_logs).
Below are my config files; please suggest any changes that are required.
filebeat.yml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/app1/app1.log
  fields:
    tags: ["app1"]
- type: log
  enabled: true
  paths:
    - /var/log/app2/app2.log
  fields:
    tags: ["app2"]
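As an aside, I understand Filebeat also has a top-level `tags` option on each input, which appends to the event's own `tags` field instead of nesting under `fields`. If that turns out to be easier to match on in Logstash, an input could look roughly like this (same paths as above, only the tagging mechanism changes):

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/app1/app1.log
  tags: ["app1"]   # lands in [tags] on the event, not [fields][tags]
```

With this variant, a Logstash conditional would check `[tags]` rather than `[fields][tags]`.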
demo-logstash.conf
input {
  beats {
    type => "log"
    port => 5044
    host => "0.0.0.0"
  }
}
output {
  if [fields][tags] == "app1" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "app1_logs"
    }
  }
  if [fields][tags] == "app2" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "app2_logs"
    }
  }
  stdout { codec => rubydebug }
}
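One thing I suspect may be the problem: since `tags` is declared as an array in filebeat.yml (`tags: ["app1"]`), `[fields][tags]` arrives as an array, so an equality check like `[fields][tags] == "app1"` would never match. A sketch of the output section using Logstash's `in` membership operator instead (assuming the same hosts and index names as above):

```conf
output {
  # "in" checks membership when the right-hand field is an array
  if "app1" in [fields][tags] {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "app1_logs"
    }
  } else if "app2" in [fields][tags] {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "app2_logs"
    }
  }
  stdout { codec => rubydebug }
}
```

Is this the right way to match against a tag array, or is there a cleaner approach?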