I am new to Elastic. I discovered that with the File Data Visualizer, we can upload files up to 500MB. Is there an "official" strategy for analyzing larger files, such as splitting the data into chunks and later combining them within Elastic using SQL, or uploading the file to AWS S3 and connecting Elastic to it? For now, I am focused on the machine learning side of Elastic Security, and I am wondering how to run jobs that require older data. Sometimes data is automatically moved to AWS S3 or other storage tiers to reduce costs. Thank you for everything!
@Illya_Bjazevic, you can bump that up to 1GB via the Kibana advanced settings.
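In case it helps, the relevant advanced setting is `fileUpload:maxFileSize` (Stack Management > Advanced Settings in the Kibana UI). It can also be changed through Kibana's settings API; a minimal sketch, assuming a hypothetical local Kibana at `localhost:5601` with placeholder credentials:

```shell
# Hypothetical host and credentials; adjust for your deployment.
# Raises the File Data Visualizer upload limit to its 1GB maximum.
curl -X POST "http://localhost:5601/api/kibana/settings" \
  -H "kbn-xsrf: true" \
  -H "Content-Type: application/json" \
  -u elastic:changeme \
  -d '{"changes": {"fileUpload:maxFileSize": "1GB"}}'
```

Note that 1GB is the hard ceiling for this setting; for anything larger you would need an ingest pipeline rather than the file uploader.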
Otherwise, you can simply ingest any file, of any size, using Elastic Agent and the Custom Logs integration.
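For the Elastic Agent route, the Custom Logs integration essentially just needs to be pointed at the file. A minimal sketch of a standalone-agent input, with a hypothetical file path and dataset name (newer agent versions may use the `filestream` input type instead of `logfile`):

```yaml
# Standalone Elastic Agent input sketch (paths and dataset are placeholders).
inputs:
  - type: logfile
    streams:
      - data_stream:
          dataset: bigdump.logs       # hypothetical dataset name
        paths:
          - /var/log/bigdump.log      # hypothetical path to the large file
```

If you manage the agent through Fleet, the same thing is done in the UI by adding the Custom Logs integration to an agent policy and entering the log file path there.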
Awesome! Thanks for the fast help!

