Shipping all Elasticsearch logs to AWS S3 on a regular basis

As the title says, I am currently looking for a solution to ship all Elasticsearch logs to AWS S3 on a regular basis. The aim is for the EC2 instance hosting my Elastic Stack (yes, I only have one instance, as a first trial at exploring the Elastic Stack technologies) to have an effectively "infinite" amount of space, i.e. I would never have to worry about the EBS volume attached to the instance filling up. This aim is motivated by the desire to make this project more/completely cloud-native.

What I am envisioning is that the logs are shipped to AWS S3 periodically and, after an interval, cleared from the EC2 instance, which would achieve the aim above.

(I've read about using the Curator tool to purge indices after a set time period.)
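For reference, here is roughly what I have in mind for the purge side, sketched in Python against the REST API. I'm using an ILM policy with a delete phase here rather than Curator (my understanding is that ILM, being built in, can cover the same use case); the endpoint, policy name and 30-day retention are placeholders.

```python
import requests

ES = "http://localhost:9200"  # placeholder: assumed single-node endpoint

# Hypothetical ILM policy: keep each time-based index for 30 days, then
# delete it. This plays the role Curator's delete_indices action would.
ilm_policy = {
    "policy": {
        "phases": {
            "hot": {"actions": {}},
            "delete": {"min_age": "30d", "actions": {"delete": {}}},
        }
    }
}

resp = requests.put(f"{ES}/_ilm/policy/logs-retention", json=ilm_policy)
resp.raise_for_status()
print(resp.json())
```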

My question is: is this the best approach towards achieving my aim stated above? If not, what would your recommendation be?

(I have also read about the Amazon Elasticsearch Service, but I am not looking to use it because of cost.)

Thank you for your time!


Bump for this!

What about using snapshots? And snapshot lifecycle management?
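For example, something along these lines, sketched in Python against the REST API. This is only a sketch: the endpoint, bucket, repository name, index pattern, schedule and retention values are all placeholders, and it assumes the repository-s3 plugin is installed on the node with access to AWS credentials (e.g. via an EC2 instance role).

```python
import requests

ES = "http://localhost:9200"  # placeholder Elasticsearch endpoint

# 1) Register an S3 snapshot repository. Requires the repository-s3 plugin
#    and AWS credentials available to the node.
repo_body = {
    "type": "s3",
    "settings": {
        "bucket": "my-log-snapshots",            # placeholder bucket name
        "base_path": "elasticsearch/snapshots",  # placeholder key prefix
    },
}
requests.put(f"{ES}/_snapshot/s3_repo", json=repo_body).raise_for_status()

# 2) SLM policy: snapshot logs-* indices every night and expire old
#    snapshots after 30 days.
slm_body = {
    "schedule": "0 30 1 * * ?",        # 01:30 every day
    "name": "<daily-logs-{now/d}>",
    "repository": "s3_repo",
    "config": {"indices": ["logs-*"]},
    "retention": {"expire_after": "30d", "min_count": 5, "max_count": 50},
}
requests.put(f"{ES}/_slm/policy/daily-logs", json=slm_body).raise_for_status()
```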

And as of 7.10, there is also the searchable snapshots feature.
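As a sketch, mounting an index straight from the S3 repository could look like this (the repository, snapshot and index names are placeholders, and note that searchable snapshots require a suitable subscription level):

```python
import requests

ES = "http://localhost:9200"  # placeholder Elasticsearch endpoint

# Mount an index directly out of a snapshot stored in the S3 repository,
# so it is searchable without restoring all of its data to local disk.
mount_body = {
    "index": "logs-2020.12.01",                  # index inside the snapshot (placeholder)
    "renamed_index": "logs-2020.12.01-mounted",  # local name for the mounted index
}
resp = requests.post(
    f"{ES}/_snapshot/s3_repo/daily-logs-2020.12.01/_mount",
    params={"wait_for_completion": "true"},
    json=mount_body,
)
resp.raise_for_status()
print(resp.json())
```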

(I have also read about the Amazon Elasticsearch Service, but I am not looking to use it because of cost.)

I'd still have a look at Cloud by Elastic, which is also available from the AWS Marketplace if needed.

Cloud by Elastic is one way to have access to all features, all managed by us. Think about what is already there, like Security, Monitoring, Reporting, SQL, Canvas, Maps UI, and Alerting, plus the built-in solutions (Observability, Security, Enterprise Search), and what is coming next :slight_smile: ...
