Is it possible to feed Logstash data into another Logstash instance that connects to Elastic?
In my setup, many remote locations connected over expensive satellite links need to send NetFlow and syslog data to a central ELK stack.
The problem is that data usage is too high. I'm considering running Logstash at every remote location to collect all logs and remove fields I don't need before sending the data on to a central Logstash instance, which would do further parsing, such as performing DNS lookups on the NetFlow data (I can't do the lookups on the remote hosts because that would use too much bandwidth).
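Here is roughly what I have in mind for the remote side. This is only a sketch: I'm assuming the standard `udp` input with the `netflow` codec for flow data, the `syslog` input for logs, and the `lumberjack` output for Logstash-to-Logstash forwarding (as I understand it, the lumberjack protocol compresses its frames with zlib). The ports, field names, and certificate path are placeholders.

```
# Remote site (sketch, assumptions noted above)
input {
  udp {
    port  => 2055      # NetFlow export target
    codec => netflow
  }
  syslog {
    port => 514
  }
}

filter {
  mutate {
    # Example only -- drop whichever fields aren't needed centrally
    remove_field => [ "[netflow][sampling_interval]", "[netflow][engine_id]" ]
  }
}

output {
  lumberjack {
    hosts           => [ "central-logstash.example.com" ]  # placeholder host
    port            => 5044
    ssl_certificate => "/etc/logstash/central.crt"         # placeholder path
  }
}
```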
Is this possible, and will it actually reduce data usage? I don't know what format Logstash uses when sending data to another Logstash instance, or whether compression can be applied to that transport.
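On the central side, I imagine something like the following sketch, assuming the `lumberjack` input as the receiving end and the `dns` filter for the reverse lookups; the field name under `[netflow]` and the certificate/key paths are placeholders:

```
# Central Logstash (sketch, assumptions noted above)
input {
  lumberjack {
    port            => 5044
    ssl_certificate => "/etc/logstash/central.crt"  # placeholder path
    ssl_key         => "/etc/logstash/central.key"  # placeholder path
  }
}

filter {
  dns {
    # Reverse-resolve the source address in place (example field name)
    reverse => [ "[netflow][ipv4_src_addr]" ]
    action  => "replace"
  }
}

output {
  elasticsearch {
    hosts => [ "http://localhost:9200" ]
  }
}
```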
For plain log files I might be able to use Beats, but I don't believe there is a Beat for NetFlow data.
Of course, if there is a better way to achieve this, please tell me.