Send only one field in pipeline to pipeline communication

I'd like my first pipeline to send the full event to an Elasticsearch output and only one field of the event to the input of a second pipeline. Is it possible to send just one field over a pipeline-to-pipeline connection rather than the complete event?

Remove the extra fields using the mutate filter.

Hello, thank you for your reply. I can't remove them because I need all the fields for my Elasticsearch output.

The pipeline output used for pipeline-to-pipeline communication does not support the codec option, so you cannot change the format of the outgoing message. That is what would have let you send just one field to one pipeline and the entire event to the other.

To do this you will need to duplicate your event using the clone filter, then use the prune filter and some conditionals to strip down only the event that was cloned.

Something like this:

```
filter {
    clone {
        clones => ["cloned"]
    }
    if [type] == "cloned" {
        prune {
            whitelist_names => [ "field-you-want-to-send" ]
        }
    }
}
output {
    if [type] == "cloned" {
        pipeline { send_to => ["your-second-pipeline"] }
    }
    if [type] != "cloned" {
        # your other output, e.g. elasticsearch
    }
}
```

If you have two pipelines then just remove the fields (using mutate or prune) in one of them. The other one will send the complete event to Elasticsearch. This is the forked-path pattern for pipeline-to-pipeline communication.
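For reference, a minimal sketch of the forked-path pattern in pipelines.yml. The pipeline names, the beats input, the field name, and the Elasticsearch host are all placeholders, not values from this thread:

```yaml
# pipelines.yml -- all ids, ports, hosts and field names are placeholders
- pipeline.id: upstream
  config.string: |
    input { beats { port => 5044 } }
    output { pipeline { send_to => ["es-out", "second-pipeline"] } }

- pipeline.id: es-out
  config.string: |
    input { pipeline { address => "es-out" } }
    output { elasticsearch { hosts => ["http://localhost:9200"] } }

- pipeline.id: second-pipeline
  config.string: |
    input { pipeline { address => "second-pipeline" } }
    filter {
      # keep only the one field you want to forward
      prune { whitelist_names => ["field-you-want-to-send"] }
    }
    output { stdout {} }
```

Each downstream pipeline receives its own copy of the full event, so the prune in second-pipeline does not affect what es-out indexes.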

Thank you for your answers.

Hi @Badger, thank you very much for your answer. I need to make sure that my event has been written to the first output (Elasticsearch) before sending a light version of it (just the identifier) to the next pipeline. With the forked-path pattern I won't be able to verify that, so in this case is using the clone filter with conditional outputs the only solution?

You cannot be certain of the order in which events are written to outputs. You could introduce a sleep filter into the second pipeline but that is a very crude way to address the issue and still not guaranteed to work.
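To illustrate the crude workaround mentioned above, a sleep filter in the second pipeline might look like this. The 5-second delay is an arbitrary placeholder and, as noted, provides no real ordering guarantee:

```
filter {
  # Crude workaround: delay every event by 5 seconds before processing,
  # in the hope that the first pipeline's Elasticsearch output has
  # already completed. This does NOT guarantee ordering.
  sleep {
    time => "5"
    every => 1
  }
}
```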

Thanks. So, would you still advise me to use the forked path? I have a number of transformation/enrichment filters in the pipeline that indexes into Elasticsearch, so I was concerned that a dedicated pipeline sending only the identifier to the second pipeline (with the forked path) would process much faster, and my full, enriched event would not have had time to be processed and sent to Elasticsearch. Having both outputs in the same pipeline ensured that the event was processed by all the filters and sent to Elasticsearch at almost the same time.

Well I guess you could do the fork after the enrichment, and then strip all the enrichment off in the second pipeline. But it feels wrong.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.