Hello,
I'm trying to get Pipeline-to-Pipeline Communication working according to the distributor pattern in https://www.elastic.co/guide/en/logstash/current/pipeline-to-pipeline.html#distributor-pattern
(I'm working on Kubernetes / Docker)
My pipelines.yml:
- pipeline.id: input-pipeline
  path.config: "pipeline/input-pipeline.conf"
- pipeline.id: vtmd-pipeline
  path.config: "pipeline/vtmd-pipeline.conf"
[...]
My input-pipeline.conf:
input {
  beats {
    port => 5044
    host => "127.0.0.1"
    type => filebeat
  }
}
filter {
}
output {
  if [type] == "filebeat" {
    pipeline { send_to => vtmd-pipeline }
  }
[...]
My vtmd-pipeline.conf:
input { pipeline { address => vtmd-pipeline } }
filter {
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch-logging:9200"]
    index => "%{[collector]}-%{[kubernetes][pod][name]}-%{+YYYY.MM.dd}"
  }
}
I cannot see any difference from the example in the documentation linked above.
I'm using Logstash version 6.3.2.
The container fails to start with the following logs:
Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2018-08-14T08:41:24,669][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[2018-08-14T08:41:24,680][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[2018-08-14T08:41:25,336][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"82e73c15-ef1f-49fd-8744-72a433c04c87", :path=>"/usr/share/logstash/data/uuid"}
[2018-08-14T08:41:26,061][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.2"}
[2018-08-14T08:41:26,813][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:vtmd-pipeline, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, { at line 1, column 45 (byte 45) after input { pipeline { address => vtmd-pipeline ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2018-08-14T08:41:27,343][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:input-pipeline, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, { at line 29, column 41 (byte 387) after output {\n\n\n #stdout { codec => rubydebug }\n\n #fuer Input von filebeat\n if [type] == \"filebeat\" {\n pipeline { send_to => vtmd-pipeline ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2018-08-14T08:41:27,723][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
(The line numbers in the error messages don't match my snippets above because I skipped the Kubernetes ConfigMap overhead.)
Why does Logstash expect a "{" after the pipeline address instead of accepting the "}" that closes the block?
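For comparison, the example in the linked documentation only uses plain barewords without hyphens as pipeline addresses (e.g. weblogs). Here is a minimal sketch of my output block with the address quoted as a string instead, in case the hyphen in vtmd-pipeline is what trips the parser (I haven't verified this):

output {
  if [type] == "filebeat" {
    # pipeline name quoted as a string instead of a hyphenated bareword
    pipeline { send_to => "vtmd-pipeline" }
  }
}

and correspondingly in vtmd-pipeline.conf:

input { pipeline { address => "vtmd-pipeline" } }

Is quoting the address required here, or should I avoid the hyphen in the pipeline name altogether?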