Currently, I have a Logstash pipeline that forwards logs to Elasticsearch. I am planning to send log files to S3 in batches using the
AWS S3 CLI. What are some drawbacks to this approach? Is it possible to keep track of which logs have already been uploaded to S3?
Thanks for any help in advance.
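To illustrate, the batch upload I have in mind is along these lines (the bucket name and paths below are placeholders):

```shell
# Placeholder bucket and paths. `aws s3 sync` skips files that already
# exist in the bucket with the same size and modification time, so it
# partially addresses the "already uploaded" question on its own.
aws s3 sync /var/log/myapp/ s3://my-log-bucket/logs/ \
  --exclude "*" --include "*.log"
```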
Can you provide more context? It is not clear what you want to do or how this relates to the Elastic Stack.
If you want to send logs to AWS S3 using the AWS CLI tool, you should direct your question to an AWS forum, as there is no relation to Elastic products.
I might have bulldozed through the question. I'm looking to implement S3 as a storage option for an Elasticsearch cluster. Any insight into the pros/cons of using S3 as a storage option for an ELK stack? Does it depend on the data load?
This is not supported; you can only use S3 storage for snapshots.
The only ways to use S3 for Elasticsearch storage are searchable snapshots, or regular snapshots (which cannot be searched directly).
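For reference, registering an S3 bucket as a snapshot repository is a single API call. A minimal sketch, assuming a local cluster, a hypothetical repository name `my_s3_repo`, and a bucket `my-log-snapshots` you have already created with appropriate credentials configured:

```shell
# Register an S3-backed snapshot repository (repository name and
# bucket are placeholders). Recent Elasticsearch versions bundle
# S3 repository support; older ones need the repository-s3 plugin.
curl -X PUT "localhost:9200/_snapshot/my_s3_repo" \
  -H 'Content-Type: application/json' \
  -d '{
        "type": "s3",
        "settings": {
          "bucket": "my-log-snapshots"
        }
      }'
```

Snapshots taken into this repository live in S3 but are not searchable unless you mount them as searchable snapshots, which requires an appropriate license tier.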
Ah ok, thanks for that enlightenment.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.