Pipelines.yml pipeline to pipeline address unavailable error

Logstash version 6.6.1

Logstash command:

/usr/share/logstash/bin/logstash  --config.reload.automatic --path.settings=/etc/logstash

I am attempting to handle multiple pipelines using the distributor pattern in my pipelines.yml. I am sending a log file from Filebeat (6.6.1), which works fine as long as I don't use pipeline-to-pipeline communication. But as soon as I use send_to in the pipeline output, I keep receiving this error:

[2019-03-04T15:20:18,898][WARN ][org.logstash.plugins.pipeline.PipelineBus] Attempted to send event to '/etc/logstash/conf.d/Process.conf' but that address was unavailable. Maybe the destination pipeline is down or stopping? Will Retry.

I have tried multiple variations of the Process.conf syntax: the full path, just Process, Process.conf, removing the brackets, config.string with just the |, double quotes, and single quotes. All of them come back with the same error.

Here is my pipelines.yml file:

- pipeline.id: processlog 
  config.string: "
    input { 
      beats { 
        port => 5044 
      } 
    }
    output {
          pipeline {
            send_to => '/etc/logstash/conf.d/Process.conf'
          }
    }
  "

Eventually I would like to use conditional statements, but I can't get logstash to process what I thought would be a simple pass through.

If I remove the output section from my pipelines.yml and add path.config instead, then everything works without issue.

- pipeline.id: processlog 
  config.string: "
    input { 
      beats { 
        port => 5044
      } 
    }
  "
  path.config: "/etc/logstash/conf.d/Process.conf"

My end goal is to be able to only have one filebeat / winlogbeat service running on each of my servers and then to send different types of log files to be processed in different ways on my logstash server.

Do you have any suggestions on where I might learn how to resolve this, or how to fix it?

If you have a pipeline+send_to like this

pipeline { send_to => '/etc/logstash/conf.d/Process.conf' }

then you must have a pipeline input whose address matches it. If you want a single pipeline that does a passthrough to another pipeline, I think you want something like

- pipeline.id: processlog
  config.string: "
    input { beats { port => 5044 } }
    output { pipeline { send_to => 'default' } }
  "
- pipeline.id: processeverything
  path.config: '/etc/logstash/conf.d/Process.conf'

Then in Process.conf you would need to include "input { pipeline { address => 'default' } }"
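Putting that together, a minimal Process.conf for a pure passthrough might look like the sketch below. The stdout output is just an assumption for verifying that events arrive; substitute your real filters and outputs.

```conf
# /etc/logstash/conf.d/Process.conf (sketch)
# The address here must match the send_to value in pipelines.yml.
input {
  pipeline {
    address => 'default'
  }
}

# Filters for this log type would go here.

output {
  # stdout is only for confirming the passthrough works; replace with your real output.
  stdout { codec => rubydebug }
}
```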

Thank you for your quick response on this.
Alright, I kind of follow what needs to be done. I completely missed the fact that my send_to had to be a virtual address.

So I have changed my pipelines.yml to look like this:

- pipeline.id: processlog
  config.string: "
    input { 
      beats { 
        port => 5044
      } 
    }
    output {
          pipeline {
            send_to => 'default'
          }
    }
  "
- pipeline.id: processeverything
  path.config: "/etc/logstash/conf.d/Process.conf"

I then added to the top of my Process.conf this:

input {
  pipeline {
    address => 'default'
  }
}

Now I only occasionally see the same error message, but Logstash still isn't processing the log files.

So I removed the:

- pipeline.id: processeverything
  path.config: "/etc/logstash/conf.d/Process.conf"

And I continually receive the error message again.

My assumption is that Logstash isn't checking /etc/logstash/conf.d/Process.conf for the virtual address default. How would Logstash know to check there? Along the same lines, how would I go about specifying different pipelines with different virtual addresses? I ask because I don't see how I would assign multiple pipelines that way.
In addition to my Process.conf pipeline, I would also want to spin up an Apache.conf pipeline, and probably a Windows.conf pipeline as well.

Once again thank you for your suggestion. I will come in tomorrow and start playing with this some more to try and figure out what I'm missing.

It's not that Logstash knows to look in that file. It knows to read and process that file when creating the processeverything pipeline, and that results in a pipeline that reads from the 'default' address. Then, when it creates the processlog pipeline and builds its pipeline output, it finds that there is a matching input.
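To extend this to multiple pipelines, each downstream config file declares its own unique address, and pipelines.yml lists each file as its own pipeline. A sketch using the Apache and Windows pipelines you mentioned (the pipeline ids, addresses, and file names here are just assumptions):

```yaml
- pipeline.id: intake
  config.string: |
    input { beats { port => 5044 } }
    output {
      pipeline { send_to => ['apache_logs', 'windows_logs'] }
    }
- pipeline.id: apache
  path.config: "/etc/logstash/conf.d/Apache.conf"
- pipeline.id: windows
  path.config: "/etc/logstash/conf.d/Windows.conf"
```

Apache.conf would then start with `input { pipeline { address => apache_logs } }` and Windows.conf with `input { pipeline { address => windows_logs } }`. Note that an array in send_to delivers a copy of each event to every listed address; to route events selectively you would wrap each pipeline output in a conditional instead.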

Thank you @Badger, I now understand how the pipelines are supposed to work using the virtual addresses.

I have now been able to get a couple of test pipelines working without any issues.

Here is my new pipelines.yml

- pipeline.id: filebeats
  config.string: |
    input { 
      beats { 
        port => 5044 
      } 
    }
    output {
      if "Process" in [fields] {
        pipeline {
          send_to => Process
        }
      }
      else if "dmesg" in [fields] {
        pipeline {
          send_to => dmesg_log
        }
      }
    }
  
- pipeline.id: process
  path.config: "/etc/logstash/conf.d/process.conf"
- pipeline.id: dmesg
  path.config: "/etc/logstash/conf.d/dmesg.conf"

Here is my process.conf

input {
  pipeline {
    address => Process
  }
}

Here is my dmesg.conf

input {
  pipeline {
    address => dmesg_log
  }
}

output {
  stdout { codec => rubydebug }
}

And for the record of how I created the fields variables, here is my filebeat.yml

filebeat.inputs:

- type: log
  enabled: true
  paths:
    - /root/process.log
  fields:
    Process: true

- type: log
  enabled: true
  paths:
    - /var/log/dmesg
  fields:
    dmesg: true

Everything appears to be working. Both process.log and dmesg are being sent to different pipelines and processed without any issues.

Once again, thank you for taking the time to help me to not only fix, but understand the issues that I had.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.