I'm new to ELK and all the related tooling. I've been reading the documentation to get a better understanding of Elastic's products and, as far as I've seen, couldn't find a solution for the case below:
I'm trying to receive syslog from 3rd-party embedded systems directly into Logstash, compress it, and send it to another Logstash instance, which will decompress it and forward it to our log management.
Simplifying:
environment #1 (syslog server -> Logstash) -> internet (compressed data) -> environment #2 (Logstash -> uncompressed data -> log management).
The main goal of the compression is to save bandwidth.
I'd also like to encrypt this data. Will that interfere with the compression idea above?
Thanks in advance.
PS: English isn't my first language, so sorry for any misspelled words or unclear sentences.
You can use the lumberjack output plugin combined with the beats input plugin. This combination will compress and encrypt the data transferred between the two Logstash instances.
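As a rough sketch of what that could look like, here are two minimal pipeline fragments, one per environment. The hostname, port, and certificate paths are placeholders you would replace with your own:

```conf
# Environment #1: receive syslog, forward over the internet via lumberjack.
# The lumberjack output compresses events and requires SSL, so the
# transfer is both compressed and encrypted.
input {
  syslog {
    port => 514
  }
}
output {
  lumberjack {
    hosts => ["logstash2.example.com"]        # placeholder hostname
    port  => 5044
    ssl_certificate => "/path/to/server.crt"  # placeholder cert path
  }
}
```

```conf
# Environment #2: receive with the beats input, then forward
# the decompressed events to your log management system.
input {
  beats {
    port => 5044
    ssl  => true
    ssl_certificate => "/path/to/server.crt"  # placeholder cert path
    ssl_key         => "/path/to/server.key"  # placeholder key path
  }
}
```

The certificate used by the lumberjack output must match the one served by the beats input, since the lumberjack protocol verifies the server's SSL certificate when it connects.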
Hi Christian,
Thank you for your suggestion.
I was reading the lumberjack documentation. Is flush_size the setting I need to configure for compression?