How to implement forked path pattern pipeline

Hi, I need some guidance understanding how to implement a forked path pattern pipeline as described here.

I'm running Logstash in a Kubernetes environment and have the following configured in my logstash.yml file:
xpack.monitoring.elasticsearch.hosts: ['https://escluster_ip:443']
xpack.monitoring.elasticsearch.password: "password"
xpack.monitoring.elasticsearch.username: "username"
xpack.monitoring.enabled: true

When the Logstash container starts up it triggers pipelineA, pipelineB, and pipelineC to start up.

I want the data coming out of pipelineA to be reparsed by a new pipeline (pipelineD). How do I do that? How would I reference pipelineA's output inside pipelineD? How do I configure pipelineA's output to send its data to pipelineD? Also, will I need to add pipelineD to the logstash.yml file above? Once pipelineD has finished parsing the data, it should send it to my ES cluster.
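For what it's worth, here is a minimal sketch of how this kind of chaining is usually wired with Logstash's pipeline-to-pipeline communication, using the `pipeline` output and input plugins. The address name and file names are assumptions for illustration, not anything from your setup:

```
# pipelineA.conf -- forward events to pipelineD instead of ES
output {
  pipeline { send_to => ["pipelineD_address"] }   # assumed virtual address
}

# pipelineD.conf -- receive pipelineA's events, reparse, then ship to ES
input {
  pipeline { address => "pipelineD_address" }     # must match send_to above
}
filter {
  # reparsing logic goes here
}
output {
  elasticsearch {
    hosts    => ["https://escluster_ip:443"]
    user     => "username"
    password => "password"
  }
}
```

With this pattern, pipelineD is referenced by its virtual address rather than by pipeline id, and pipelineA no longer needs its own elasticsearch output for this data.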

Do I need to put the "queue.type: persisted" and "config.string: ..." settings in my logstash.yml file?
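As I understand it, `queue.type` and `config.string` (or `path.config`) are per-pipeline settings that belong in `pipelines.yml` rather than `logstash.yml`; something like the following sketch, where the paths are assumptions:

```yaml
# pipelines.yml -- one entry per pipeline; paths are illustrative
- pipeline.id: pipelineA
  path.config: "/usr/share/logstash/pipeline/pipelineA.conf"
- pipeline.id: pipelineD
  path.config: "/usr/share/logstash/pipeline/pipelineD.conf"
  queue.type: persisted   # persist in-flight events for this pipeline
```

Is that the right place for them, or do some of these also need to be set globally in logstash.yml?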

Thank you.
