Elastic Security: Strategies for Analyzing Large Files (Over 500MB)

I am new to Elastic. I discovered that the File Data Visualizer lets us upload files up to 500MB. Is there an "official" strategy for analyzing larger files, such as splitting the data into chunks and later combining them within Elastic using SQL, or uploading the file to AWS S3 and connecting Elastic to it? For now, I am focused on the machine learning side of Elastic Security, and I am wondering how to run jobs that require older data. Sometimes data is automatically moved to AWS S3 or other storage tiers to reduce costs. Thank you for everything!

@Illya_Bjazevic, you can bump that limit up to 1GB via Kibana's advanced settings (the `fileUpload:maxFileSize` setting under Stack Management > Advanced Settings).
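The setting can be changed in the Kibana UI; if you prefer scripting it, here is a minimal sketch that posts the change to Kibana's settings endpoint. The URL, credentials, and the availability of `/api/kibana/settings` in your Kibana version are assumptions, not part of the original answer:

```python
# Minimal sketch: raise the File Data Visualizer upload limit to 1GB.
# KIBANA_URL and AUTH are placeholders; the /api/kibana/settings endpoint
# is an assumption and may differ by version -- the same change can always
# be made in the UI under Stack Management > Advanced Settings.
import requests

KIBANA_URL = "https://localhost:5601"  # assumed Kibana endpoint
AUTH = ("elastic", "changeme")         # placeholder credentials

resp = requests.post(
    f"{KIBANA_URL}/api/kibana/settings",
    json={"changes": {"fileUpload:maxFileSize": "1GB"}},
    headers={"kbn-xsrf": "true"},      # Kibana requires this header on writes
    auth=AUTH,
    verify=False,                      # only for local/self-signed test setups
)
resp.raise_for_status()
print(resp.json())
```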

Otherwise, you can ingest a file of any size using Elastic Agent with the Custom Logs integration.
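Elastic Agent with the Custom Logs integration is the route recommended above. Purely as an illustration of the "split into chunks" idea from the original question, here is a minimal sketch that streams a large NDJSON file into an index in batches using the official `elasticsearch` Python client; the index name, file path, endpoint, and credentials are placeholders:

```python
# Minimal sketch: bulk-ingest a large NDJSON log file in chunks, bypassing
# the File Data Visualizer size limit. Connection details are placeholders.
import json
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch(
    "https://localhost:9200",
    basic_auth=("elastic", "changeme"),  # placeholder credentials
    verify_certs=False,                  # only for local test clusters
)

def actions(path, index="big-logs"):
    """Yield one bulk action per line of an NDJSON file."""
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield {"_index": index, "_source": json.loads(line)}

# streaming_bulk sends documents in batches (chunk_size docs per request),
# so the whole file never has to fit into a single upload.
for ok, item in helpers.streaming_bulk(es, actions("huge_logs.ndjson"), chunk_size=1000):
    if not ok:
        print("failed:", item)
```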


Awesome! Thanks for the fast help!
