Coupling Filebeat prospector with Logstash pipeline

This would require some kind of filtering plus forwarding to another pipeline within Logstash, which I don't think is possible as-is. See, for example: Logstash 6 - Multiple Pipelines for One Input

In Filebeat, each prospector can either use the fields setting to pass additional fields for filtering, or use the pipeline setting. The latter is somewhat private: Filebeat uses it to set the ingest node pipeline name for the Elasticsearch output, but the value configured per prospector is also available in the event as [@metadata][pipeline] when sending to Logstash. You can still have separate processing 'pipelines'/filters, so to speak, by guarding each prospector's filters on that value. In the Logstash filter section, the prospector-specific processing would be wrapped in a conditional like this (the pipeline name is a placeholder; a matching prospector sketch follows the conditional):

# "app_logs" is a placeholder for the pipeline value configured in the prospector
if [@metadata][pipeline] == "app_logs" {
  # drop the field so the Elasticsearch output does not pick it up
  # as an ingest node pipeline name
  mutate {
    remove_field => ["[@metadata][pipeline]"]
  }

  # custom per prospector filters
}
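
For reference, a minimal Filebeat prospector sketch that sets the pipeline value the conditional above guards on (the log path and the app_logs name are placeholders):

filebeat.prospectors:
- type: log
  paths:
    - /var/log/app/*.log
  # surfaced in Logstash as [@metadata][pipeline]
  pipeline: app_logs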

If you really do filter on [@metadata][pipeline], make sure to remove the field as well, as the conditional above does: with 6.0 we ask users to set the ingest node pipeline for the Elasticsearch output via [@metadata][pipeline], so leaving it in place could unintentionally select an ingest pipeline. Alternatively, use a custom field for the filtering instead, as sketched below.
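
A minimal sketch of the custom-field alternative, assuming a hypothetical log_type field: the prospector attaches it via the fields setting (nested under fields by default), and the Logstash filter guards on it without touching [@metadata][pipeline] at all.

# filebeat.yml
filebeat.prospectors:
- type: log
  paths:
    - /var/log/app/*.log
  fields:
    # placeholder field name and value
    log_type: app_logs

# Logstash filter section
filter {
  if [fields][log_type] == "app_logs" {
    # custom per prospector filters
  }
}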