Elasticsearch usage for logs with large entries


I am considering using Elasticsearch for logging.
The system generates about 5-10k log entries per second on average, for a total of about 500 MB per second. Most log entries are small, but some are in the 300-500 KB range; these contain data in XML format. If the load grows, the logging subsystem should be able to handle it by adding more servers.
The retention period is not decided yet. I am thinking about 1 month, but a shorter duration, such as 2 weeks, is possible.
So the logging system should compress data efficiently, to avoid requiring lots of drives, and it must be able to handle this volume.
Can Elasticsearch handle this amount of data efficiently, and is it suitable for the case where log entries are large?
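For rough capacity planning, the figures quoted above (500 MB/s raw, 30-day retention) work out as follows; the 5:1 compression ratio is purely a placeholder assumption, not a measured number:

```python
# Back-of-envelope sizing from the figures stated above.
# Assumptions: 500 MB/s raw ingest, 30-day retention; the 5:1
# compression ratio is a guess, not a measurement.
RAW_BYTES_PER_SEC = 500 * 10**6   # ~500 MB/s as stated
SECONDS_PER_DAY = 86_400
RETENTION_DAYS = 30

raw_per_day = RAW_BYTES_PER_SEC * SECONDS_PER_DAY
raw_total = raw_per_day * RETENTION_DAYS

print(f"Raw per day: {raw_per_day / 1e12:.1f} TB")   # 43.2 TB
print(f"Raw total:   {raw_total / 1e15:.2f} PB")     # 1.30 PB
# Hypothetical 5:1 compression (actual ratio depends on data and settings):
print(f"At 5:1:      {raw_total / 5 / 1e15:.2f} PB") # 0.26 PB
```

Even at an optimistic compression ratio, a 1-month retention lands in the hundreds of terabytes, so shortening retention to 2 weeks roughly halves the footprint.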

Hi @andsm, welcome to the Elastic community!

Your use case is feasible and Elasticsearch can handle that volume of data. To minimise disk usage, look into tuning the index mappings so you don't waste space indexing fields you never search or aggregate on.
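As a sketch of that kind of tuning, an index template along these lines (the `logs-*` pattern and field names are hypothetical, adapt them to your documents) enables `best_compression` and stops the large XML payload from being indexed at all, so it is only kept in `_source`:

```json
PUT _index_template/logs-template
{
  "index_patterns": ["logs-*"],
  "template": {
    "settings": {
      "index.codec": "best_compression"
    },
    "mappings": {
      "properties": {
        "message": { "type": "text", "norms": false },
        "xml_payload": {
          "type": "keyword",
          "index": false,
          "doc_values": false
        }
      }
    }
  }
}
```

With `"index": false` and `"doc_values": false` the payload remains retrievable with the document but costs no inverted-index or doc-values space; only use this for fields you never need to query directly.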

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.