I have an issue which should be simple to resolve, but I cannot.
Basically we send all our log data to a Beat on a server in the DMZ, and that then forwards it on to Elasticsearch.
However, we now have a requirement to forward the same data to a third party. We cannot configure the devices to send to multiple servers, and Filebeat can only send to one output, which is Elasticsearch.
So how can I manage this? The Filebeat must stay, but it has to send concurrently to Elasticsearch and the third party from the same server.
Filebeat only supports a single output; if you send your data to one Elasticsearch cluster, you cannot send it anywhere else.
The easiest solution would be to send your data to a Logstash server and have that Logstash server forward the data to both your current Elasticsearch cluster and your third party, but it seems that you cannot do that.
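As a rough sketch, this is what such a Logstash pipeline could look like. The hostnames, port, and the third-party HTTP endpoint are placeholders for illustration; I'm assuming here that the third party can accept events over HTTP, otherwise you would swap in whichever output plugin matches their ingestion method:

```
input {
  beats {
    port => 5044   # point Filebeat at this Logstash instance instead of Elasticsearch
  }
}

output {
  # Every event is sent to both outputs
  elasticsearch {
    hosts => ["https://your-es-cluster:9200"]   # placeholder
  }
  http {
    url => "https://third-party.example.com/ingest"   # hypothetical endpoint
    http_method => "post"
    format => "json"
  }
}
```

One caveat: outputs in the same pipeline share a queue, so if one output blocks, backpressure can delay the other. The pipeline-to-pipeline "output isolator" pattern in the Logstash documentation addresses this by giving each output its own pipeline and queue.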
The other way would be to query your Elasticsearch cluster with Logstash and send the results to your third party, but this would not be real time, could be expensive, and would add load to your Elasticsearch cluster depending on the volume of data.
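For completeness, a minimal sketch of that polling approach using the elasticsearch input plugin. The index pattern, time window, and third-party endpoint are assumptions you would adjust:

```
input {
  elasticsearch {
    hosts => ["https://your-es-cluster:9200"]   # placeholder
    index => "filebeat-*"
    # pull only events from the last polling window
    query => '{ "query": { "range": { "@timestamp": { "gte": "now-5m" } } } }'
    schedule => "*/5 * * * *"   # run every 5 minutes
  }
}

output {
  http {
    url => "https://third-party.example.com/ingest"   # hypothetical endpoint
    http_method => "post"
    format => "json"
  }
}
```

Note that with this approach you also need to handle overlap or gaps between polling windows yourself, which is part of why it is less attractive than forwarding in-stream.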
Thanks. I can run Logstash on a separate server, and from my reading I believe I would use the clone function to do that. However, I have seen people mention that if one output has issues, it delays the other output. Is this correct?