Hi,
I am fairly new to Elasticsearch and I am seeking some advice. I have inherited an ELK stack which is processing logs from several sources and I am looking into forwarding some of those logs to another instance of Elasticsearch.
Our current setup (CentOS 7) is:
Logstash -> Elasticsearch (5.6) -> Kibana
What I want to achieve is to forward some of the logs to a new instance of Elasticsearch and also keep the logs in my current stack. What would be the best way to achieve this?
I've been reading a lot, but the documentation is somewhat confusing. Thanks in advance for any advice.
Logstash can send to multiple outputs, so the simplest option is to configure an additional elasticsearch output in logstash to send some logs to your new cluster.
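A minimal sketch of what that looks like in the pipeline config (the hostnames are placeholders for your actual clusters) — every event Logstash processes is written to both outputs:

```
output {
  # Existing cluster (placeholder host)
  elasticsearch {
    hosts => ["http://old-cluster:9200"]
  }
  # New cluster -- each event is duplicated to this output as well
  elasticsearch {
    hosts => ["http://new-cluster:9200"]
  }
}
```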
If you have existing indices in Elasticsearch that you want to copy to the new cluster, you can either use snapshot & restore to export them from the old cluster and import them into the new one, or run reindex-from-remote on the new cluster to pull the data across.
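For the reindex-from-remote route, a request like this (run against the new cluster; the index name is a placeholder) would pull an index across. Note that the old cluster's address must first be whitelisted via `reindex.remote.whitelist` in the new cluster's `elasticsearch.yml`:

```
POST _reindex
{
  "source": {
    "remote": {
      "host": "http://old-cluster:9200"
    },
    "index": "logs-2018.01.01"
  },
  "dest": {
    "index": "logs-2018.01.01"
  }
}
```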
Yes, you would just need to tell Logstash to duplicate the event to a second output (potentially with a filter applied).
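As a sketch of the filtered case, you can wrap the second output in a conditional so only matching events are forwarded (the `[type]` field and its value here are just example placeholders):

```
output {
  # All events still go to the existing cluster
  elasticsearch {
    hosts => ["http://old-cluster:9200"]
  }
  # Only events matching the condition are duplicated to the new cluster
  if [type] == "apache" {
    elasticsearch {
      hosts => ["http://new-cluster:9200"]
    }
  }
}
```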
If you need more specific advice, it would be best to ask in the #logstash forum - the people there will be better able to help you out.
If you want to recreate the indices and preserve the existing data, then reindex is what you're after.
If you're going to pump the data through Logstash again, then it doesn't sound like you'll need to worry about reindexing.