Query on Logstash to Elasticsearch and third party

Hi,

I have an issue which should be simple to resolve, but I cannot work it out.

Basically, we send all our log data to a Beat on a server in the DMZ, and that then forwards it on to Elasticsearch.

However, we now have a requirement to forward the same data on to a third party. We cannot configure the devices to send to multiple servers, and we can only use Filebeat to send to one output, which is Elasticsearch.

So how can I manage this? Filebeat must stay, but it has to send concurrently to Elasticsearch and the third party from the same server.

Is there a possible solution ?

Cheers,
Ian

Filebeat only supports a single output; if you send your data to one Elasticsearch cluster, you cannot send it anywhere else from the same Filebeat instance.

The easiest solution would be to send your data to a Logstash server and have that Logstash server send the data both to your current Elasticsearch cluster and to your third party, but it seems you cannot do that.
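For reference, a minimal sketch of what that Logstash pipeline could look like, assuming the third party accepts events over plain TCP (the hosts, port, and index name below are placeholders, not your actual setup):

```
# hypothetical pipeline: receive from Filebeat, fan out to Elasticsearch and a third party
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["https://your-es-cluster:9200"]   # placeholder Elasticsearch address
    index => "filebeat-%{+YYYY.MM.dd}"
  }
  tcp {
    host  => "third-party.example.com"          # placeholder third-party endpoint
    port  => 6514
    codec => json_lines
  }
}
```

Every event that enters the pipeline is sent to all outputs, so no extra routing is needed for the simple "send everything to both" case.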

The other way would be to query your Elasticsearch cluster with Logstash and send the results to your third party, but this would not be real time, and depending on the volume of data it can be expensive and add load to your Elasticsearch cluster.
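As a rough illustration of that approach (again only a sketch, with placeholder hosts, index pattern, and schedule), the elasticsearch input can poll the cluster periodically and push the results out:

```
# hypothetical pipeline: periodically pull recent documents from Elasticsearch
input {
  elasticsearch {
    hosts    => ["https://your-es-cluster:9200"]   # placeholder Elasticsearch address
    index    => "filebeat-*"
    query    => '{ "query": { "range": { "@timestamp": { "gte": "now-5m" } } } }'
    schedule => "*/5 * * * *"                      # run the query every 5 minutes
  }
}

output {
  tcp {
    host  => "third-party.example.com"             # placeholder third-party endpoint
    port  => 6514
    codec => json_lines
  }
}
```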

Another solution would be to run multiple Filebeat instances, one for each output.
As long as you keep the data folders separate it should work; see the sketch below.
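Something along these lines for the second instance, assuming a separate config file pointing at the third party (the paths, filenames, and output are just examples, and the third party would need to accept whichever output type you pick):

```
# hypothetical second config, e.g. /etc/filebeat/filebeat-thirdparty.yml
path.data: /var/lib/filebeat-thirdparty   # separate data folder so the registries don't clash
path.logs: /var/log/filebeat-thirdparty

filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log                # placeholder log path

output.logstash:                          # placeholder; use whatever output the third party accepts
  hosts: ["third-party.example.com:5044"]
```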

Thanks. I can run Logstash on a separate server, and from what I have read I believe I would use the clone function to do that. However, I have also seen people mention that if one output has issues it delays the other output; is this correct?
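For what it's worth, a minimal sketch of the clone filter being referred to (the clone name and outputs are placeholders). Note that within a single Logstash pipeline every event already goes to all outputs, so clone is mainly useful when the copies need to be filtered or routed differently:

```
# hypothetical sketch: duplicate each event so the copy can be routed to a different output
filter {
  clone {
    clones => ["third_party_copy"]              # placeholder clone type
  }
}

output {
  if [type] == "third_party_copy" {
    tcp {
      host  => "third-party.example.com"        # placeholder third-party endpoint
      port  => 6514
      codec => json_lines
    }
  } else {
    elasticsearch {
      hosts => ["https://your-es-cluster:9200"] # placeholder Elasticsearch address
    }
  }
}
```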
