I'm new to Elasticsearch and have a couple of questions that I can't seem to find the answers to.
Is there a way to configure Elasticsearch to compress or delete the logs after a certain number of days or when storage reaches a certain size?
I'm currently ingesting around 1.6GB of logs per day (going by the "store.size" shown in Kibana), and it's adding up every day. What are some ways I can approach this? I guess this ties back to my question #1.
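To make question #1 concrete: I've read that newer Elasticsearch versions have index lifecycle management (ILM), and a policy roughly like the one below is what I'm hoping for. This is just a sketch from the docs I found; the policy name, index ages, and priorities are my own placeholders, so please correct me if I've misunderstood how it works.

```json
PUT _ilm/policy/logs-retention-policy
{
  "policy": {
    "phases": {
      "warm": {
        "min_age": "3d",
        "actions": {
          "forcemerge": { "max_num_segments": 1 },
          "set_priority": { "priority": 50 }
        }
      },
      "delete": {
        "min_age": "10d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}
```

My (possibly wrong) understanding is that once this policy is attached to an index template, indices would get force-merged after 3 days and deleted after 10 days without any manual steps.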
I played around with Curator a bit, and it seems to be what I'm looking for. However, is it possible to set it up so that it monitors Elasticsearch and, for example, automatically deletes indices that are 10 days old, without me having to run Curator manually?
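For reference, this is roughly the Curator action file I've been experimenting with (the `logstash-` prefix, file paths, and 10-day cutoff are from my setup, so adjust as needed):

```yaml
# delete_indices.yml — delete logstash-* indices older than 10 days
actions:
  1:
    action: delete_indices
    description: "Delete indices older than 10 days, based on index name"
    options:
      ignore_empty_list: True
    filters:
      - filtertype: pattern
        kind: prefix
        value: logstash-
      - filtertype: age
        source: name
        direction: older
        timestring: '%Y.%m.%d'
        unit: days
        unit_count: 10
```

Right now I invoke it by hand with `curator --config /etc/curator/curator.yml /etc/curator/delete_indices.yml`. I'm guessing the usual answer is to schedule that command with cron (e.g. a nightly entry like `0 1 * * * curator --config /etc/curator/curator.yml /etc/curator/delete_indices.yml`), but I'd like to confirm whether there's a more built-in way.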