Any way to compress output before sending to Elasticsearch?

Hi All,

Still new to ELK. I have Logstash running on my machine generating 1 Mbps of plain-text data (with a lot of redundant text in the logs). Is there a way to encode/compress this output so it consumes less bandwidth, and then decode it on the cloud side and feed it to my Elasticsearch?

Option 1: Logstash (on my machine, low resources) --> compressed data --> (decode it somehow) --> Elasticsearch (on my server)

Option 2: Logstash (on my machine, low resources, compress-data output plugin) --> (decode-data input plugin) Logstash (on centralized server, Elasticsearch output plugin) --> Elasticsearch (on my server)

Are there any ways or plugins to address this? The goal is to save network bandwidth.
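One option worth checking for the direct Logstash-to-Elasticsearch path: the elasticsearch output plugin has an `http_compression` setting that gzips each bulk request body (assuming your Logstash version supports it; the host below is a placeholder). Elasticsearch can decompress these on receipt when HTTP compression is enabled on its side:

```
# Sketch of a Logstash pipeline output that gzips bulk requests to ES.
# Assumes a logstash-output-elasticsearch version with the http_compression option.
output {
  elasticsearch {
    hosts => ["http://my-es-server:9200"]   # placeholder ES address
    http_compression => true                # gzip each request body on the wire
  }
}
```

Since log text is highly redundant, gzip on the HTTP layer alone can cut the bandwidth substantially without any extra decode step on the server.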


I'm also interested! Can anyone help? Thanks!

The Logstash Forwarder successor (Filebeat) has compression capability. Could you use Filebeat to compress the logs and ship them to an LS instance sitting on the ES box, or directly to ES?
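For the direct-to-ES case, a minimal sketch of what that could look like in `filebeat.yml` (the log path and host are placeholders; `compression_level` ranges from 0 for off to 9 for maximum):

```
# Sketch: Filebeat shipping directly to Elasticsearch with gzip compression.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log          # placeholder path to the logs

output.elasticsearch:
  hosts: ["http://my-es-server:9200"] # placeholder ES address
  compression_level: 3                # gzip level; higher = smaller but more CPU
```

If shipping to Logstash instead, Filebeat's `output.logstash` uses the lumberjack protocol, which also supports a `compression_level` setting, and the Beats input on the Logstash side decompresses transparently.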

Hi Michael,

Thanks a lot for your answer. I don't have the option of installing LS on the ES box, but I will explore the Filebeat documentation to see if it can send the log files compressed and directly to ES.