I am experimenting with the Elastic Stack on my laptop. I need to ingest a batch of logs dating back a few years up to the current time. I am using Logstash to process the logs and send them to Elasticsearch. I have limited space on my SSD but plenty of space on my HDD.
I would like to ingest the logs and allocate the indices to either the SSD or the HDD based on the timestamp of the logs. For example, if the logs are 2 weeks old or newer, they go to the SSD, and if they are older than 2 weeks, they go to the HDD. Is this possible?
You will need to have at least 2 nodes:
1 node that uses SSD storage and acts as the hot node, receiving the newest logs
1 node that uses HDD storage and acts as the warm node, receiving the older logs
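With the two nodes in place, you can steer indices between them using shard allocation filtering: tag each node with a custom attribute in its `elasticsearch.yml`, then require the matching attribute on each index. Since Logstash's Elasticsearch output creates time-based indices by default (e.g. `logstash-%{+YYYY.MM.dd}`), each index corresponds to a date, so the two-week cutoff can be applied per index. A minimal sketch, assuming an attribute named `box_type` and hypothetical index names:

```shell
# elasticsearch.yml on the SSD node:
#   node.attr.box_type: hot
# elasticsearch.yml on the HDD node:
#   node.attr.box_type: warm

# Indices newer than 2 weeks: require the hot (SSD) node
curl -X PUT "localhost:9200/logstash-2019.06.01/_settings" \
  -H 'Content-Type: application/json' \
  -d '{ "index.routing.allocation.require.box_type": "hot" }'

# Once an index is older than 2 weeks, flip the attribute;
# Elasticsearch relocates its shards to the warm (HDD) node
curl -X PUT "localhost:9200/logstash-2019.01.01/_settings" \
  -H 'Content-Type: application/json' \
  -d '{ "index.routing.allocation.require.box_type": "warm" }'
```

The attribute name, node layout, and index names above are placeholders; flipping the setting on an aging index can also be automated, for example with Index Lifecycle Management or a scheduled script, rather than run by hand.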