We are using Elasticsearch version 6.4.2 to store access logs for our website. The logs are quite heavy and take up almost 3-4 GB of storage per day. Sometimes our IR team needs these logs to investigate web-based attacks. What is the best way to export bulk data like this to CSV? I know X-Pack reporting in Kibana is limited to 10 MB. I tried increasing that to 100 MB, and also increasing http.max_content_length to 200 MB, but most of the time Elasticsearch would fail with a Java heap space error.
We have not bought the commercial license for X-Pack.
What is the best way to get these logs, which will be about 500 MB - 1 GB, out of Elasticsearch?
You can use Logstash for this, I think.
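A minimal sketch of a Logstash pipeline for this: the elasticsearch input plugin pulls matching documents (it handles scrolling through large result sets for you), and the csv output plugin writes them to a file. The index name, the date range in the query, and the field names here are placeholders — swap in the actual fields from your access-log mapping.

```
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Hypothetical index pattern; use your actual access-log index.
    index => "access-logs-*"
    # Example query limiting the export to the last day; adjust as needed.
    query => '{ "query": { "range": { "@timestamp": { "gte": "now-1d/d" } } } }'
  }
}

output {
  csv {
    path => "/tmp/access-logs-export.csv"
    # Hypothetical field names; list the fields your IR team needs.
    fields => ["@timestamp", "clientip", "request", "response", "bytes"]
  }
}
```

Because Logstash streams documents in batches rather than building one giant response, it avoids the heap pressure you hit with Kibana reporting, and the output file size is only limited by disk.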
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.