After using pipeline-to-pipeline communication in pipelines.yml, the other pipelines are not working

Hi, when I use pipeline-to-pipeline communication in pipelines.yml, the other pipelines in the file stop working.

If I comment out those lines, the other pipelines start working again.
This is my pipelines.yml; the first pipeline sends data to the following four, and the others are JDBC pipelines:

- pipeline.id: fork-filebeat
  path.config:  "/etc/logstash/conf.d/filebeat/fork-logs.conf"

- pipeline.id: diskfs-filebeat
  path.config:  "/etc/logstash/conf.d/filebeat/discos-fs.conf"

- pipeline.id: sap-filebeat
  path.config:  "/etc/logstash/conf.d/filebeat/sap.conf"

- pipeline.id: diskfs-filebeat-quebec
  path.config:  "/etc/logstash/conf.d/filebeat/discos-fs-quebec.conf"

- pipeline.id: diskfs-filebeat-alpha
  path.config:  "/etc/logstash/conf.d/filebeat/discos-fs-alpha.conf"

- pipeline.id: component
  path.config:  "/etc/logstash/conf.d/component.conf"

- pipeline.id: interface
  path.config:  "/etc/logstash/conf.d/interface.conf"

- pipeline.id: eventos
  path.config:  "/etc/logstash/conf.d/logstash-jdbc-eventos.conf"

- pipeline.id: cpu
  path.config:  "/etc/logstash/conf.d/logstash-jdbc-cpu.conf"

- pipeline.id: memory
  path.config:  "/etc/logstash/conf.d/logstash-jdbc-memory.conf"

- pipeline.id: ping
  path.config:  "/etc/logstash/conf.d/logstash-jdbc-ping.conf"

This is the fork-filebeat pipeline:

input {
  beats {
    port => 5044
    include_codec_tag => false
  }
}

output {
  if [tags][0] == "disktag" {
    pipeline { send_to => "fs-pipe" }
  }
  if [tags][0] == "saptag" {
    pipeline { send_to => "sap-pipe" }
  }
  if [tags][0] == "disktag-quebec" {
    pipeline { send_to => "fs-pipe-quebec" }
  }
  if [tags][0] == "disktag-alpha" {
    pipeline { send_to => "fs-pipe-alpha" }
  }
}

What is wrong with my conf?

What are the configs for those pipelines?

If you have a pipeline with this output:

pipeline { send_to => "fs-pipe" }

You will need another pipeline with this input:

pipeline { address => "fs-pipe" }

So, share the files of those pipelines.
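For example, the conf file of a receiving pipeline would typically look something like the sketch below; the elasticsearch output and index name here are only placeholders I made up, not your actual config:

input {
  pipeline {
    address => "fs-pipe"    # must match the send_to value used in fork-filebeat
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]    # placeholder host, adjust to your cluster
    index => "diskfs-%{+YYYY.MM.dd}"      # placeholder index name
  }
}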

Also, do your documents have the tags field? Can you share an example of one source document? Use a stdout output to capture that message.
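For example, a temporary output like this in the fork-filebeat pipeline will print the full event, including the tags field, to the console or the Logstash log:

output {
  stdout {
    codec => rubydebug    # prints every field of the event, including [tags]
  }
}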


Hi @leandrojmp, yes I have those inputs:

input { pipeline { address => "fs-pipe" } }

and the tags also work to direct the data to a specific pipeline and index.

The problem is that when I put those pipeline-to-pipeline configurations in pipelines.yml, the other ones (the JDBC pipelines) stop working.

The logstash logs will list the pipelines that are running and the pipelines that are not running after it loads the configuration. What does that message say?
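With a default package install that message normally ends up in /var/log/logstash/logstash-plain.log, so something like this should find it (the path is the package default; adjust it if you changed path.logs):

grep "Pipelines running" /var/log/logstash/logstash-plain.log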


Hi @Badger, this is what that log message shows:

 :non_running_pipelines=>[]}

The strange thing is that if I run Logstash with the command

/usr/share/logstash/bin/logstash --path.settings /etc/logstash

all pipelines in pipelines.yml work. Any ideas why they don't work when running Logstash as a service?

Maybe an issue with the user that the service runs as.
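To check that, you can look at which user the process actually runs under; for example (assuming the standard systemd unit name from the package install):

# user configured in the service unit
systemctl cat logstash | grep User=

# user of the running process
ps -ef | grep logstash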

1 Like

Mmmm maybe,

If I run the service as root, will Logstash run as root or as the logstash user?

And if I run the command /usr/share/logstash/bin/logstash --path.settings /etc/logstash, it will run as root, right?

The last_run files for the JDBC inputs have ownership logstash:logstash and are updated when I run the command /usr/share/logstash/bin/logstash --path.settings /etc/logstash as root.
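I will also check whether running from the shell as root left any root-owned files that the logstash user can no longer write once the service starts; something along these lines should show it (/var/lib/logstash is the default path.data for a package install, and the last_run location depends on what last_run_metadata_path is set to):

# look for files not owned by logstash
ls -lR /var/lib/logstash
ls -l /path/to/last_run/files    # wherever last_run_metadata_path points for the jdbc inputs

# hand anything root-owned back to the logstash user
chown -R logstash:logstash /var/lib/logstash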
