All,
We have a requirement to read data from Kafka topics with Logstash and send all of the data to cluster 1 but only some of it to cluster 2. Is this possible? A sample configuration would really help.
Thanks
You can have multiple outputs for the same pipeline and you can use conditionals to decide which events go to which outputs.
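For example, a minimal sketch of that approach (hostnames, index names, and the tag value are placeholders, not from this thread): the first output receives every event unconditionally, while the second is wrapped in a conditional so only matching events reach the second cluster.

    output {
      # every event goes to cluster 1
      elasticsearch {
        hosts => ["cluster1:9200"]
        index => "all-events"
      }
      # only events carrying this tag are also sent to cluster 2
      if "send_to_cluster2" in [tags] {
        elasticsearch {
          hosts => ["cluster2:9200"]
          index => "subset-events"
        }
      }
    }

Outputs in the same pipeline each receive a copy of every event, so no clone filter is needed; the conditional simply skips the second output for non-matching events.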
Thanks for the reply.
I want all events to go to cluster 1 and only a few events to go to cluster 2.
Does the configuration below work?
output {
  stdout { codec => "rubydebug" }
  elasticsearch {
    hosts => ["dev cluster"]
    user => "logstash_user"
    password => "XXXXX"
    index => "%{index_name}"
  }
  if "okta" in [tags] {
    elasticsearch {
      hosts => ["okta cluster"]
      user => "logstash_user"
      password => "XXXXX"
      index => "%{index_name}"
    }
  }
}
Thanks
That looks reasonable, yes.
Thank you. I will test this and see how it goes.