Hi, I need some guidance on implementing the forked path pipeline pattern described here.
I'm running Logstash in a Kubernetes environment and have the following configured in my logstash.yml file:
xpack.management.elasticsearch.hosts: ['https://escluster_ip:443']
xpack.management.elasticsearch.password: "password"
xpack.management.elasticsearch.username: "username"
xpack.management.enabled: true
xpack.management.pipeline.id: [pipelineA, pipelineB, pipelineC]
xpack.monitoring.elasticsearch.hosts: ['https://escluster_ip:443']
xpack.monitoring.elasticsearch.password: "password"
xpack.monitoring.elasticsearch.username: "username"
xpack.monitoring.enabled: true
When the Logstash container starts up, pipelineA, pipelineB, and pipelineC all start as expected.
I want the data coming out of pipelineA to be reparsed by a new pipeline (pipelineD), which should then send the finished events to my ES cluster. Specifically:

- How do I configure pipelineA's output to send its data to pipelineD?
- How do I reference pipelineA's output inside of pipelineD?
- Will I also need to add pipelineD to the logstash.yml file above?

I've sketched my best guess below.
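Based on my reading of the pipeline-to-pipeline docs, this is roughly what I'm imagining. The address name ("pipelineD") and the elasticsearch output settings are my own guesses, not something I've tested:

# pipelineA's output (this pipeline body lives in centralized pipeline management):
output {
  pipeline { send_to => ["pipelineD"] }
}

# pipelineD (new): read from that virtual address, then write to ES
input {
  pipeline { address => "pipelineD" }
}
output {
  elasticsearch {
    hosts    => ["https://escluster_ip:443"]
    user     => "username"
    password => "password"
  }
}

I'm also guessing I'd change the management setting to xpack.management.pipeline.id: [pipelineA, pipelineB, pipelineC, pipelineD] so that pipelineD gets loaded, but I'm not sure.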
Do the "queue.type: persisted" and "config.string: ..." settings shown in the forked path example also need to go in my logstash.yml file?
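For reference, the example I'm working from puts those settings in pipelines.yml, something like the following (adapted to my pipeline names; the beats input is just a placeholder), and I'm not clear how this interacts with centralized pipeline management:

- pipeline.id: pipelineA
  queue.type: persisted
  config.string: |
    input { beats { port => 5044 } }
    output { pipeline { send_to => ["pipelineD"] } }
- pipeline.id: pipelineD
  queue.type: persisted
  config.string: |
    input { pipeline { address => "pipelineD" } }
    output { elasticsearch { hosts => ["https://escluster_ip:443"] } }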
Thank you.