How to send the same data to multiple Elasticsearch clusters with the Logstash elasticsearch output

All,

We have a requirement where we need to read data from Kafka topics with Logstash, send all of the data to cluster 1, and send only some of the data to cluster 2. Please let us know if this is possible, and it would really help us if anyone could provide sample code.

Thanks

You can have multiple outputs for the same pipeline and you can use conditionals to decide which events go to which outputs.
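
For example, a minimal sketch of that pattern (the hostnames, index names, and the "okta" tag here are placeholders, not values from your setup):

output {
  # unconditional output: every event goes to cluster 1
  elasticsearch {
    hosts => ["https://cluster1.example.com:9200"]
    index => "all-events"
  }
  # conditional output: only events carrying the tag also go to cluster 2
  if "okta" in [tags] {
    elasticsearch {
      hosts => ["https://cluster2.example.com:9200"]
      index => "okta-events"
    }
  }
}

An event that matches the conditional is sent to both outputs; outputs are not exclusive unless you make the conditionals mutually exclusive (e.g. with if/else).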

Thanks for reply.

I want all events to go to cluster 1 and only a few events to go to cluster 2.

Does the configuration below work?

output {
  stdout { codec => "rubydebug" }
  elasticsearch {
    hosts => ["dev cluster"]
    user => "logstash_user"
    password => "XXXXX"
    index => "%{index_name}"
  }
  if "okta" in [tags] {
    elasticsearch {
      hosts => ["okta cluster"]
      user => "logstash_user"
      password => "XXXXX"
      index => "%{index_name}"
    }
  }
}

Thanks

That looks reasonable, yes.

Thank you. I will test this and see how it goes.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.