AWS plugin installation problem


In order to send my Elasticsearch indices to AWS S3, I apparently need to install the cloud-aws plugin on my server.

For that, I have to run this command from the /usr/share/elasticsearch directory:
sudo bin/plugin install elasticsearch/elasticsearch-cloud-aws/2.7.1

But I have two problems with this command. The first is that the bin/plugin file does not exist for me. I only have a plugins folder and a bin/elasticsearch-plugin file, so I guess the correct file is bin/elasticsearch-plugin.

So I tried to run this command:
sudo bin/elasticsearch-plugin install elasticsearch-cloud-aws/2.7.1

I used 2.7.1 because it seems to be the latest version of the plugin, and I recently updated Elasticsearch, so the versions should be compatible.

The result I get with this command is:

sudo bin/elasticsearch-plugin install elasticsearch-cloud-aws/2.7.1
-> Installing elasticsearch-cloud-aws/2.7.1
-> Failed installing elasticsearch-cloud-aws/2.7.1
-> Rolling back elasticsearch-cloud-aws/2.7.1
-> Rolled back elasticsearch-cloud-aws/2.7.1
A tool for managing installed elasticsearch plugins

Non-option arguments:

Option             Description
------             -----------
-E <KeyValuePair>  Configure a setting
-h, --help         Show help
-s, --silent       Show minimal output
-v, --verbose      Show verbose output
ERROR: Unknown plugin elasticsearch-cloud-aws/2.7.1

Do you have any ideas to help me solve this problem? Thank you in advance.

That plugin lives here and is 6 years old - GitHub - elastic/elasticsearch-cloud-aws: AWS Cloud Plugin for Elasticsearch. I would suggest that is not the plugin you need.

Can you elaborate a little more on what you are trying to achieve?

Hi @warkolm, thank you for the answer

I currently use Graylog, so I have Graylog and Elasticsearch installed on the same server. The goal would be to keep one year of logs.

The server has 1 TB of disk space, but it is already almost saturated because it receives 30 GB per day on average.

The goal would be to keep one month of logs locally and then send the older ones to S3 storage.

The AWS side should already be configured: I created a bucket with my graylog folder in it, and a user named "svc-graylog" who has full rights to this bucket.

I now have to do the configuration on my server, and I started following this documentation:

And I find myself stuck at the plugin installation step.

That blog is 6 years old, it's out of date :slight_smile:
Take a look at Snapshot and restore | Elasticsearch Guide [7.12] | Elastic

Thank you for your help @warkolm ,
I installed the plugin with the correct command
sudo bin/elasticsearch-plugin install repository-s3
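For the record, I also checked that the plugin was picked up. This is what I ran on my setup (assuming an install under /usr/share/elasticsearch managed by systemd; paths and the service name may differ):

```shell
cd /usr/share/elasticsearch

# confirm repository-s3 now appears among the installed plugins
sudo bin/elasticsearch-plugin list

# restart the node so it actually loads the plugin
sudo systemctl restart elasticsearch
```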

However, I can't quite work out the next part: the configuration that makes the connection with my AWS S3 bucket.

Did you read Register a snapshot repository | Elasticsearch Guide [7.12] | Elastic?

I'd also read: Getting Started | Elasticsearch Plugins and Integrations [7.12] | Elastic


@warkolm, yes I did, thank you, but there are several types of repository if I understood correctly, and I do not see which one to choose.

Hello @dadoonet, thank you for your answer. I find it strange, since we just give the name of the bucket but I don't see any real link with our AWS account.

So what did you try so far?

If you scroll down on the doc page I linked to, you will see that there is a specific link to S3, which is what you seem to be after - S3 Repository Plugin | Elasticsearch Plugins and Integrations [7.12] | Elastic

@dadoonet I think I have everything configured on S3: I created a SIEM bucket and a user who has full rights on it.

On my graylog/elasticsearch server I installed the plugin; now I need to configure it.

So I wonder if I need to do this part (Getting Started | Elasticsearch Plugins and Integrations [7.12] | Elastic), since my AWS side is already created and configured, or whether I should go directly to this step:
Client Settings | Elasticsearch Plugins and Integrations [7.12] | Elastic

@warkolm actually, I did not see that we could move on to the next page after installing the plugin, thank you.

Those are the "same" step actually. Which is "how to register a repository".

The 1st page is the simplest way.
The 2nd page is when you need to configure the S3 client which is often the case I'd say.

I'm normally doing:

bin/elasticsearch-keystore add s3.client.default.access_key
bin/elasticsearch-keystore add s3.client.default.secret_key


PUT _snapshot/siem_repo
{
  "type": "s3",
  "settings": {
    "bucket": "siem"
  }
}

I'd recommend checking the permissions listed here: Repository Settings | Elasticsearch Plugins and Integrations [7.12] | Elastic
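Once the repository is registered, a quick sanity check that the plugin can actually reach the bucket, sketched here with the same siem_repo name as in the example above (both are standard snapshot APIs):

```
GET _snapshot/siem_repo

POST _snapshot/siem_repo/_verify
```

If the verify call fails, the error usually points at credentials or bucket permissions.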

Do the s3.client.default.access_key and the s3.client.default.secret_key match the user I created in AWS S3 to manage the siem bucket?
Or maybe I should create an access point on S3; I just found out about those.

If you are using access and secret keys, yes you need to enter them in the keystore.
So the S3 Client which is used by the plugin will know how to access your S3 buckets.

You might have to define the region as well in elasticsearch.yml.

s3.client.default.region: us-east-1
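One more note: after adding or changing the keys in the keystore, the node has to pick them up. In 7.x the S3 client credentials are reloadable secure settings, so this should work without a full restart (otherwise restarting the node does it too):

```
POST _nodes/reload_secure_settings
```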

And is it the access and secret key generated by the creation of the access point?

I believe those are credentials associated with your user account on AWS.
Don't remember exactly how AWS console works TBH :slightly_smiling_face:

Okay thank you, sorry I'm asking a lot of questions, but I can't see in the doc how to filter by date, for example if I only want to send indices that are more than a month old.

It does not seem related to the original question, which is about plugin installation, is it?

If it's not, could you please open a new discussion and explain exactly what you are meaning?