I am fairly new to this but I have found that you can pull an enormous amount of data into Elastic pretty quickly. The hard bit is getting your head around the various ins and outs and also making the data easy to read in Kibana.
You have missed out the key variables, though. How long it takes depends on two main things:
1 - What is the log data?
2 - How big is the log data?
"Six months of log data" is a very broad statement. What are the logs from - one application or a whole company wide infrastructure covering multiple types and hosts? Also, what is the total size (uncompressed) or all logs?