Unable to store index into S3

While taking a backup of the .kibana index to an S3 bucket, I am getting the error below. Kindly help.

curl -XPUT '1.1.1.1:9200/_snapshot/siemdefaultrepo/kibana_14june2016?wait_for_completion=true&pretty' -d '{"indices":".kibana","partial":"true"}'
{
  "snapshot" : {
    "snapshot" : "kibana_14june2016",
    "version_id" : 2010199,
    "version" : "2.1.1",
    "indices" : [ ".kibana" ],
    "state" : "PARTIAL",
    "start_time" : "2016-06-14T05:56:52.442Z",
    "start_time_in_millis" : 1465883812442,
    "end_time" : "2016-06-14T05:56:52.848Z",
    "end_time_in_millis" : 1465883812848,
    "duration_in_millis" : 406,
    "failures" : [ {
      "index" : ".kibana",
      "shard_id" : 0,
      "reason" : "IndexShardSnapshotFailedException[The operation is not valid for the object's storage class (Service: Amazon S3; Status Code: 403; Error Code: InvalidObjectState; Request ID: 92F1523F13E582F9)]; nested: AmazonS3Exception[The operation is not valid for the object's storage class (Service: Amazon S3; Status Code: 403; Error Code: InvalidObjectState; Request ID: 92F1523F13E582F9)]; ",
      "node_id" : "DnFayvNKTHmWC34xy_mLxg",
      "status" : "INTERNAL_SERVER_ERROR"
    } ],
    "shards" : {
      "total" : 1,
      "failed" : 1,
      "successful" : 0
    }
  }
}
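
The failure message points at the object's storage class, so it may help to check which storage class the objects already in the repository bucket use. Below is a minimal sketch with the AWS CLI, assuming the CLI is configured with the same credentials; the bucket name and prefix are placeholders, not values from this setup:

# List the repository objects together with their storage class (placeholders: my-snapshot-bucket, siemdefaultrepo/)
aws s3api list-objects-v2 --bucket my-snapshot-bucket --prefix siemdefaultrepo/ --query 'Contents[].{Key: Key, StorageClass: StorageClass}'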

Please format your code using the </> button. I am updating your post for you.

The INTERNAL_SERVER_ERROR is coming from the S3 API. Maybe you have more details in the Elasticsearch logs?
Also check that those credentials can read and write data to the S3 repository.
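
As a quick check, the repository verification API reports whether the nodes can write to and read from the repository. A sketch against the same host and repository name used in the original post:

curl -XPOST '1.1.1.1:9200/_snapshot/siemdefaultrepo/_verify?pretty'

If verification fails, the response should surface the underlying S3 error, which can help narrow it down to credentials versus the bucket's storage class or lifecycle configuration.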