Elasticsearch for 500GB+ Audit Trail

We are designing new applications and thinking about solutions for an audit trail. We need to store 10 years of user transactions (clicks, changes, text they write, etc.). On our current SQL DB this is about 400 GB of data. Is Elasticsearch the right tool for something like this? Or should we use a SQL/NoSQL DB to store the data and use Elasticsearch as a search service on top?

You asked this on Reddit too, right? I thought it looked familiar :slight_smile:

I'd give the same answer as what you got there. If the 400GB is 10 years of data, then Elasticsearch will cope. If that is a single year's worth, which means you'd end up with around 4TB over 10 years, Elasticsearch will still cope. If you are talking about 100GB a day, then it's going to be costly and you may want to look at something else.

Generally, data in Elasticsearch is considered hot/warm, in that it's available for search in near real time, and it costs resources to keep it that way. If you are ingesting massive amounts of data it may not make sense to store it in that manner for the longer term.
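If you do keep the recent data in Elasticsearch, index lifecycle management can automate rolling indices from hot to warm and eventually dropping them from the cluster. A rough sketch below uses the `_ilm/policy` REST endpoint; the policy name, ages, and sizes are just assumptions to show the shape of it, and the delete phase assumes the old indices have been archived somewhere else by then:

```python
import requests

ES = "http://localhost:9200"  # assumption: local, unsecured cluster

# Hypothetical lifecycle: roll over hot indices roughly monthly, demote them
# to the warm tier, and delete them from the cluster once they are a year old
# (on the assumption that they have been archived elsewhere by that point).
policy = {
    "policy": {
        "phases": {
            "hot": {
                "actions": {"rollover": {"max_age": "30d", "max_size": "50gb"}}
            },
            "warm": {
                "min_age": "30d",
                "actions": {"set_priority": {"priority": 50}},
            },
            "delete": {
                "min_age": "365d",
                "actions": {"delete": {}},
            },
        }
    }
}

resp = requests.put(f"{ES}/_ilm/policy/audit-trail-policy", json=policy)
resp.raise_for_status()
```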


I did ask on Reddit! Thank you very much for your answer. In fact, someone there also mentioned that indices are only compatible one major version back? So thinking of a long-term solution, this might not be the right tool for cold storage.

What would be a best practice for long-term cold data? I need Elasticsearch in the equation because it really meets our needs. Maybe having MongoDB hold all the data and somehow linking it to ES? Does that make sense?
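To make that concrete, here is roughly what I'm imagining: MongoDB stays the source of truth and each event also gets indexed into Elasticsearch for search. The collection and index names are made up, and the dual write is only a sketch:

```python
from datetime import datetime, timezone

import requests
from pymongo import MongoClient

# Assumptions: local MongoDB and Elasticsearch with no auth; names are made up.
mongo = MongoClient("mongodb://localhost:27017")
audit = mongo["audit"]["events"]
ES = "http://localhost:9200"


def record_event(user_id: str, action: str, payload: dict) -> None:
    """Write the full event to MongoDB, then index a search copy into ES."""
    event = {
        "user_id": user_id,
        "action": action,
        "payload": payload,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    result = audit.insert_one(event)  # pymongo adds an _id to the dict
    # Index the copy under the MongoDB _id so the two stores cross-reference.
    doc = {k: v for k, v in event.items() if k != "_id"}
    requests.put(
        f"{ES}/audit-events/_doc/{result.inserted_id}", json=doc
    ).raise_for_status()


record_event("u-123", "update_profile", {"field": "email"})
```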

Thanks for sharing your knowledge.

Generally we see people storing logs separately in something like S3/Ceph (or even tape) for longer-term storage.
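If you want Elasticsearch itself to push the older indices out of the cluster, snapshotting them to an S3 repository is one option. A minimal sketch, assuming the cluster has S3 repository support and AWS credentials already configured; the bucket, repository, and index names are made up:

```python
import requests

ES = "http://localhost:9200"  # assumption: local, unsecured cluster

# Register an S3 snapshot repository (bucket name is hypothetical).
repo = {"type": "s3", "settings": {"bucket": "my-audit-archive"}}
requests.put(f"{ES}/_snapshot/audit_archive", json=repo).raise_for_status()

# Snapshot old audit indices into the repository; once the snapshot succeeds,
# the live indices can be deleted from the cluster to free hot/warm resources.
snapshot = {"indices": "audit-2015-*", "include_global_state": False}
requests.put(
    f"{ES}/_snapshot/audit_archive/audit-2015?wait_for_completion=true",
    json=snapshot,
).raise_for_status()
```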

