Coupling Filebeat prospector with Logstash pipeline


We are moving to ELK 6.0 and are very pleased that we can now define Filebeat prospectors and Logstash pipelines in their own files. This makes it easier for our teams to create and deploy their own configurations (and have Filebeat and Logstash dynamically reload them).

We have Filebeat running with multiple prospectors, but a prospector can't define its own output, so it looks like we are forced to use the same output for all of them.
We have multiple pipelines in Logstash, but since all events from Filebeat arrive on the same input, we can't take advantage of them.

Googling the problem, the recommended solution seems to be running multiple instances of Filebeat, but this is not optimal for us.

Is there any way for us to have a prospector deliver to a specific pipeline without having to alter the "shared" configuration files? Or for Logstash to route to a specific pipeline, again without altering the "shared" configuration? Maybe something like adding a tag in the prospector and filtering on that in the "input" of the pipeline?

Best regards

This would require some kind of filtering plus forwarding to another pipeline in Logstash. I don't think this is currently possible with Logstash as is; see, for example: Logstash 6 - Multiple Pipelines for One Input.

In Filebeat, each prospector can either use the fields setting to pass additional fields for filtering, or use the pipeline setting. The latter is somewhat private and is used by Filebeat to set the ingest node pipeline name for the Elasticsearch output. The pipeline setting of each prospector is available in the event via [@metadata][pipeline]. You can still have separate processing 'pipelines/filters', so to speak, by guarding them on the pipeline contents. Each prospector-specific processing block would be wrapped like this:

if [@metadata][pipeline] {
  mutate {
    remove_field => ["[@metadata][pipeline]"]
  }

  # custom per-prospector filters
}
If you do filter on [@metadata][pipeline], make sure to remove it as well, because with 6.0 we ask users to set the ingest node pipeline via [@metadata][pipeline]. Or, better, use a custom field instead.
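On the Filebeat side, a prospector carrying the pipeline setting might look like this (a sketch for filebeat.yml; the log path and pipeline name are hypothetical):

```yaml
filebeat.prospectors:
  - type: log
    paths:
      - /var/log/myservice/*.log   # hypothetical path
    # surfaces in Logstash as [@metadata][pipeline]
    pipeline: myservice
```

The fields setting works the same way, but lands under [fields] in the event instead of [@metadata], so it survives to the output unless you remove it.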

Hi Steffen

Thank you for the reply 🙂

We have a team managing our ELK cluster, and we also have a number of software development teams. I would love it if a software development team could deploy a service/site and include a Filebeat prospector and a Logstash pipeline (with automatic reloading of configuration and no editing of the shared configuration). This is almost possible already; we just need the ability to link the two.
Is that a feature Elastic would consider adding? (As I understand your suggestion, we can't get all the way to where we want to be.)

Best regards

Maybe this is better asked in the Logstash forum.

Logstash supports a conf.d folder for loading multiple configuration files; see the path.config setting in the logstash.yml file (e.g. Logstash Directory Layout).
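For reference, the relevant settings in logstash.yml might look like this (a sketch using the default package paths):

```yaml
# logstash.yml
path.config: /etc/logstash/conf.d/*.conf
config.reload.automatic: true   # re-read config files when they change
```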

Yet I'm not sure whether automatic config reloading works with conf.d, e.g. by just adding/removing files with filter definitions there. If it does, and given they are all added to the same pipeline in Logstash, the if-guard trick shown above might work.
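Putting the two together, each team could then drop its own file into conf.d, guarded on a custom field set by its prospector (a sketch; the filename, the service field, and its value are hypothetical):

```
# /etc/logstash/conf.d/team-a.conf (hypothetical filename)
filter {
  if [fields][service] == "team-a" {
    # team-specific filters go here
    mutate { add_tag => ["team-a"] }
  }
}
```

Because every file in conf.d is concatenated into one pipeline, the guard is what keeps one team's filters from touching another team's events.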

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.