Sending data to multiple outputs with different parsing, using the S3 input plugin

This is the setup that we need:

Input plugin: S3

Output plugins: Elasticsearch and S3

However, the requirement is that the data going to Elasticsearch won't be the same as the data going to S3.

The data going to S3 will need more mutation than the data going to Elasticsearch.

I don't see a way to use a filter (for mutating) inside the output block of Logstash.

Could you please advise how this can be achieved?

It is not possible to use filters in the output block; every transformation of the data needs to be done in the filter block.

What you need is to use multiple pipelines; check the documentation for the forked-path pattern.

You will need three pipelines: one will have your input and common filters and will output to the other two; of those, one will output to Elasticsearch, and the other will apply the extra mutations and output to S3.

You need something like this:

pipelines.yml

- pipeline.id: input-s3
  path.config: /path/to/input-s3.conf
- pipeline.id: output-es
  path.config: /path/to/output-es.conf
- pipeline.id: output-s3
  path.config: /path/to/output-s3.conf

input-s3.conf

input {
  s3 {
    # your s3 input configuration
  }
}
filter {
  # common filter configuration
}
output {
  pipeline {
    send_to => ["output-es", "output-s3"]
  }
}
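As a concrete sketch, the first pipeline could look like this; the bucket name, region, codec, and the date filter are placeholder assumptions for illustration, not values from the original question:

```
input {
  s3 {
    # hypothetical source bucket and region
    bucket => "my-source-bucket"
    region => "us-east-1"
    codec => "json"
  }
}
filter {
  # a common filter applied to events for both outputs;
  # the "timestamp" field is a hypothetical example
  date {
    match => ["timestamp", "ISO8601"]
  }
}
output {
  # forked-path pattern: send every event to both downstream pipelines
  pipeline {
    send_to => ["output-es", "output-s3"]
  }
}
```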

output-es.conf

input {
  pipeline {
    address => "output-es"
  }
}
filter {
  # filters for the es output only
}
output {
  elasticsearch {
    # your es output configuration
  }
}

output-s3.conf

input {
  pipeline {
    address => "output-s3"
  }
}
filter {
  # filters for the s3 output only
}
output {
  s3 {
    # your s3 output configuration
  }
}
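Filled in, the S3 branch with its extra mutations could look like this; the mutate fields, bucket, region, and prefix below are hypothetical placeholders, not values from the original question:

```
input {
  pipeline {
    address => "output-s3"
  }
}
filter {
  # extra mutations applied only to the copy of the events going to s3;
  # the field names here are hypothetical examples
  mutate {
    remove_field => ["@version"]
    add_field => { "archived" => "true" }
  }
}
output {
  s3 {
    # placeholder bucket, region, and prefix
    bucket => "my-archive-bucket"
    region => "us-east-1"
    prefix => "logstash/%{+YYYY.MM.dd}"
    codec => "json_lines"
  }
}
```

Events sent to the "output-es" pipeline never pass through this filter block, so the Elasticsearch copy stays untouched.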
