Hi,
I know this is a well-known topic and I have read various discussions, but I'm still stuck. If someone could help me, I would be grateful.
What I'm trying to achieve is this:
https://www.elastic.co/guide/en/logstash/current/pipeline-to-pipeline.html#distributor-pattern
My Filebeat configuration listens to different folders, and it reacts as soon as I add a file to one of them:
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - d:/mygit/all/*.log
  - type: log
    enabled: true
    paths:
      - d:/mygit/first/*.log
    tags: ["first"]
  - type: log
    enabled: true
    paths:
      - d:/mygit/second/*.log
    tags: ["second"]
And here is my pipelines.yml:
- pipeline.id: beats-server
  config.string: |
    input { beats { port => 5044 } }
    output {
      if "first" in [tags] {
        pipeline { send_to => first }
      } else if "second" in [tags] {
        pipeline { send_to => second }
      } else {
        pipeline { send_to => all }
      }
    }
- pipeline.id: first
  path.config: "d:/mygit/my-config-files/first.conf"
- pipeline.id: second
  path.config: "d:/mygit/my-config-files/second.conf"
- pipeline.id: all
  path.config: "d:/mygit/all.conf"
And my configuration files start with the following (same for first, second, and all):
input { pipeline { address => first } }
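So first.conf, for example, is roughly this (the output here is just a placeholder stdout for illustration; the real filters and outputs shouldn't matter for this problem):

# d:/mygit/my-config-files/first.conf (sketch; actual filter/output omitted)
input { pipeline { address => first } }
output { stdout { codec => rubydebug } }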
As soon as I drag and drop a file into one of the Filebeat folders, I get an error in my Logstash terminal:
[2022-04-05T17:02:06,459][WARN ][org.logstash.plugins.pipeline.PipelineBus][beats-server][<long id>] Attempted to send event to 'first' but that address was unavailable. Maybe the destination pipeline is down or stopping? Will Retry.
To start Logstash I just ran:
./bin/logstash
The pipelines.yml is picked up automatically and everything starts up as usual; the last lines (before I add a file) are:
[2022-04-05T17:01:45,500][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:"beats-server"], :non_running_pipelines=>[]}
[2022-04-05T17:01:45,546][INFO ][org.logstash.beats.Server][beats-server][<long id>] Starting server on port: 5044
What am I doing wrong? Do I have to run multiple Logstash instances for the different pipelines? Or something else along those lines?
Thanks!