I am completely new to Elastic, so apologies if this is trivial.
I have managed to set up the ELK stack (self-managed, all on the same host machine) and have Filebeat's Apache module enabled and sending logs to Logstash. I use the following pipeline, as per the examples, to parse them:
input {
  beats {
    port => 5044
  }
}
output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => ["http://172.17.17.50:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      pipeline => "%{[@metadata][pipeline]}"
    }
  } else {
    elasticsearch {
      hosts => ["http://172.17.17.50:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  }
}
I now want to use Filebeat to read Apache-style logs from a Node/Express application. I have set up an additional input in filebeat.yml with basic settings:
- type: log
  enabled: true
  paths:
    - /var/log/supervisor/express-api*.log
This seems to work, as the log lines are making it to Kibana unfiltered (i.e. the message field contains the entire log line).
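From reading around, I gather I could tag this input so the two log types are distinguishable downstream, something like the following (the express-api tag name is just one I made up):

- type: log
  enabled: true
  paths:
    - /var/log/supervisor/express-api*.log
  tags: ["express-api"]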
As I understand it, I can only have a single pipeline listening on a single port. So the point I am stuck at is: how do I set up Logstash to filter these logs whilst still keeping the aforementioned Apache pipeline running?
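My current guess is that the answer is a conditional filter block in the same pipeline, branching on that tag, roughly like the sketch below, but I am unsure whether this is the right approach. (The grok pattern assumes the Express app writes Apache combined-format lines, which may not match my actual output.)

filter {
  if "express-api" in [tags] {
    # parse the raw line, assuming Apache combined log format
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    # lift the parsed request timestamp into @timestamp
    date {
      match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}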
Thanks in advance,
P