Hi,
I am fairly new to ELK and I am seeking some advice. I have inherited an ELK stack that is processing logs from multiple sources, and I am looking into forwarding some of those logs to another instance of Elasticsearch.
Our current setup is (CentOS 7):
Logstash -> Elasticsearch (5.6) -> Kibana
What I want to achieve is to forward some of the logs to a new instance of Elasticsearch and also keep the logs in my current stack. What would be the best way to achieve this?
I was told in the elasticsearch forum that I would have to "duplicate the event to a second output". Any advice is greatly appreciated. Thanks.
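For context, "duplicating the event to a second output" in Logstash just means listing two `elasticsearch` blocks in the `output` section; every event is then sent to both clusters. A minimal sketch (the second cluster's host name below is a placeholder):

```conf
output {
  # existing local cluster (unchanged)
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # second cluster receives a copy of every event
  elasticsearch {
    hosts => ["second-cluster.example.com:9200"]
  }
}
```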
Thanks AquaX,
I was able to forward all the output to my 2nd cluster but I believe what I need is a bit different. I'll try to explain:
I want to keep processing all my logs from different sources with Logstash and then send them to the localhost instance of Elasticsearch. (This is my current setup, which I want to keep untouched.)
I now want to process the same logs with Logstash, but "extract" and "modify" them along the way, and then send them to my 2nd instance of Elasticsearch.
I see this like having another pipeline processing the same logs but using a different output. Is this possible/reasonable?
In my output file I have tried to filter by type and then send those types to my 2nd instance of Elasticsearch.
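For reference, a type-based filter in the output file would normally take the form of a conditional around the second output block; events that do not match the condition are only sent to the unguarded output (type name and host below are placeholders):

```conf
output {
  # all events still go to the local cluster
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # only events of the matching type are forwarded
  if [type] == "apache-access" {
    elasticsearch {
      hosts => ["second-cluster.example.com:9200"]
    }
  }
}
```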
I also configured a completely new input with an output to my 2nd instance of Elasticsearch.
In both scenarios my filter did not work, and all the information (from all my inputs) was sent to my 2nd instance of Elasticsearch as well.
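(For anyone hitting the same symptom: in Logstash 5.x all config files are merged into a single pipeline, so every input feeds every output unless the output is wrapped in a conditional. A common workaround is to tag events at the input and guard the second output on that tag. A sketch, with placeholder port, tag, and host names:)

```conf
input {
  # tag only the events that should be forwarded
  beats {
    port => 5044
    tags => ["forward_me"]
  }
}

output {
  # everything still goes to the local cluster
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # only tagged events reach the second cluster
  if "forward_me" in [tags] {
    elasticsearch {
      hosts => ["second-cluster.example.com:9200"]
    }
  }
}
```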