Dependable bulk output to local and remote Elasticsearch

I am running several instances of a local ELK Stack. On each of them, Logstash imports, filters, and then outputs to the local Elasticsearch:

output {
  # ship all events to the local Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

Due to space constraints, indices older than 7 days are deleted from the local Elasticsearch. I want to archive the data from all local ELK Stacks on one central, remote stack. I need a solution that is dependable (copes with high network usage peaks, etc.), offers some form of authentication, and, since it is quite a lot of data, transmits in aggregates of e.g. 5 minutes or 1 hour.
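What I have in mind is roughly a second elasticsearch output pointing at the central cluster. A minimal sketch, assuming the remote cluster is reachable over HTTPS with basic auth — the hostname and credentials below are placeholders:

output {
  # local cluster, as before
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # central archive cluster; host, user, and password are placeholders
  elasticsearch {
    hosts => ["https://archive.example.com:9200"]
    user => "logstash_archiver"
    password => "changeme"
    ssl => true
  }
}

As far as I understand, the elasticsearch output already sends events as bulk requests (sized via pipeline.batch.size), so some per-request aggregation happens anyway, but nothing like 5-minute or 1-hour batches.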

What is the best practice for this task? I am not sure whether I should use 1) the logstash-output plugins, 2) elasticdump, 3) Beats, 4) or something completely different (rsync?).
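If option 1 is the way to go, I assume the network-peak concern could be handled with Logstash's persistent queue, which buffers events on disk while the remote output is slow or unreachable. A sketch for logstash.yml, with sizes that are purely illustrative:

# logstash.yml -- illustrative values
queue.type: persistent   # buffer events on disk instead of in memory
queue.max_bytes: 4gb     # cap on the on-disk queue; tune to available disk space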
