Failed to perform any bulk index operations: 403 Forbidden:

Hello Search Gurus,

I am getting this error on an AWS ELK POC cluster (1 node). I am using Filebeat to ingest data. With curl I can create an index, so any suggestions are welcome. Thanks.

elasticsearch/client.go:317 Failed to perform any bulk index operations: 403 Forbidden:

403 Forbidden

You don't have permission to access /_bulk on this server.

If you are using AWS ES you may need to send the data through Logstash and use the amazon_es output, as AWS ES, as far as I know, does not support standard HTTP auth.
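For reference, a minimal sketch of such a Logstash pipeline, receiving from Beats and writing through the amazon_es output plugin (the endpoint, region, and index pattern below are placeholders you would replace with your own):

```conf
# Hypothetical pipeline: Beats in, AWS ES out via the amazon_es plugin.
input {
  beats {
    port => 5044
  }
}

output {
  amazon_es {
    hosts  => ["my-domain.us-east-1.es.amazonaws.com"]  # placeholder AWS ES endpoint
    region => "us-east-1"                               # placeholder region
    index  => "filebeat-%{+YYYY.MM.dd}"
  }
}
```

The amazon_es plugin signs each request with AWS credentials, which is what lets it through where plain HTTP auth is rejected.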

The Elastic Cloud Elasticsearch Service is also available on AWS; it does not have this limitation and works fine directly with Beats.

Thanks, but when I publish to Logstash, I get the error below:

2018-10-22T15:45:27.929Z DEBUG [logstash] logstash/async.go:159 596 events out of 596 events sent to logstash host ecs-url:443. Continue sending
2018-10-22T15:45:27.929Z INFO [publish] pipeline/retry.go:189 retryer: send unwait-signal to consumer
2018-10-22T15:45:27.929Z INFO [publish] pipeline/retry.go:191 done
2018-10-22T15:45:27.943Z ERROR logstash/async.go:252 Failed to publish events caused by: lumberjack protocol error
2018-10-22T15:45:27.943Z DEBUG [transport] transport/client.go:131 closing
2018-10-22T15:45:27.943Z ERROR logstash/async.go:252 Failed to publish events caused by: lumberjack protocol error
2018-10-22T15:45:27.943Z INFO [publish] pipeline/retry.go:166 retryer: send wait signal to consumer
2018-10-22T15:45:27.943Z INFO [publish] pipeline/retry.go:168 done
2018-10-22T15:45:27.958Z DEBUG [logstash] logstash/async.go:159 579 events out of 579 events sent to logstash host ecs-url:443. Continue sending
2018-10-22T15:45:27.958Z DEBUG [logstash] logstash/async.go:116 close connection
2018-10-22T15:45:27.958Z DEBUG [logstash] logstash/async.go:116 close connection
2018-10-22T15:45:27.958Z ERROR logstash/async.go:252 Failed to publish events caused by: client is not connected
2018-10-22T15:45:28.959Z ERROR pipeline/output.go:109 Failed to publish events: client is not connected
2018-10-22T15:45:28.959Z DEBUG [logstash] logstash/async.go:111 connect

What does your config look like?

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["https://log-xxx.com:443"]

  # Optional protocol and basic auth credentials.
#  protocol: "https"
#  username: "elastic"
#  password: "changeme"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  #hosts: ["localhost:5044"]
  hosts: ["log-xxx.com:443"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]
  ssl.certificate_authorities: ["/etc/pki/tls/chain.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

If I want to use the Elasticsearch output, I comment out the Logstash portion of the config.


I also see the ping request failing with: 403 Forbidden: {"message":"Authorization header requires 'Credential' parameter. Authorization header requires 'Signature' parameter. Authorization header requires 'SignedHeaders' parameter. Authorization header requires existence of either a 'X-Amz-Date' or a 'Date' header


Thanks

Where do you have Logstash running? Is it really listening on port 443? What does the Logstash config look like?

I pointed it at the same endpoint of the AWS-hosted Elasticsearch service; we have an nginx proxy in front of that service.

Regards
Venkatesh

Then you cannot use the Logstash output.
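The Logstash output in Beats speaks the lumberjack protocol to a Logstash beats input (typically on port 5044), not HTTPS; pointing it at an Elasticsearch or nginx endpoint is what produces the "lumberjack protocol error" above. A sketch of what that output would look like with an actual Logstash node in the flow (the host name is a placeholder):

```yaml
#----------------------------- Logstash output --------------------------------
output.logstash:
  # Must be a Logstash node running the beats input plugin,
  # not the Elasticsearch/nginx HTTPS endpoint.
  hosts: ["logstash-host:5044"]
```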

Then how can I make Filebeat work with the AWS Elasticsearch service? Using curl I am able to create an index, but Filebeat is throwing errors.

Thanks

How are you authenticating with curl? Filebeat supports HTTP basic auth, which I don't think AWS ES does. You might be able to do something at your proxy layer, if you have one, or introduce a Logstash node into the flow.
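For context, AWS ES authenticates requests with AWS Signature Version 4, which is exactly what the 403 body is asking for (Credential, SignedHeaders, Signature, X-Amz-Date). A minimal stdlib sketch of what a SigV4 signer computes, just to show where those pieces come from; the endpoint, keys, and region are placeholders, and a real client would use an AWS SDK or the amazon_es plugin rather than hand-rolling this:

```python
import datetime
import hashlib
import hmac


def sigv4_headers(method, host, path, payload, access_key, secret_key,
                  region="us-east-1", service="es"):
    """Compute the SigV4 headers AWS ES expects on every request."""
    now = datetime.datetime.utcnow()
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")   # the X-Amz-Date header
    date_stamp = now.strftime("%Y%m%d")

    # 1. Canonical request (empty query string, two signed headers).
    signed_headers = "host;x-amz-date"
    canonical_headers = "host:%s\nx-amz-date:%s\n" % (host, amz_date)
    payload_hash = hashlib.sha256(payload).hexdigest()
    canonical_request = "\n".join(
        [method, path, "", canonical_headers, signed_headers, payload_hash])

    # 2. String to sign, scoped to date/region/service.
    scope = "%s/%s/%s/aws4_request" % (date_stamp, region, service)
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256", amz_date, scope,
        hashlib.sha256(canonical_request.encode()).hexdigest()])

    # 3. Derive the signing key with chained HMACs, then sign.
    def _hmac(key, msg):
        return hmac.new(key, msg.encode(), hashlib.sha256).digest()

    key = _hmac(("AWS4" + secret_key).encode(), date_stamp)
    for part in (region, service, "aws4_request"):
        key = _hmac(key, part)
    signature = hmac.new(key, string_to_sign.encode(),
                         hashlib.sha256).hexdigest()

    # 4. Assemble the Authorization header the 403 message names.
    authorization = ("AWS4-HMAC-SHA256 Credential=%s/%s, "
                     "SignedHeaders=%s, Signature=%s"
                     % (access_key, scope, signed_headers, signature))
    return {"X-Amz-Date": amz_date, "Authorization": authorization}
```

Filebeat's Elasticsearch output only sends `Authorization: Basic ...`, which is why AWS ES rejects it while a signed request (or one allowed through unauthenticated at the proxy) succeeds.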

I ran the command below as root; Filebeat is also running as root on the same node:
curl -v -k -H 'Content-Type: application/json' -XPUT "https://aws-es-url.com/test/external/1?pretty" -d '{"name": "CreateIndex", "type": "test1"}'

I can see this index and its data in Kibana.

Regards

2018-11-02T23:32:20.681Z DEBUG [monitoring] elasticsearch/elasticsearch.go:197 Monitoring could not connect to elasticsearch, failed with X-Pack capabilities query failed with: 403 Forbidden: {"message":"Authorization header requires 'Credential' parameter. Authorization header requires 'Signature' parameter. Authorization header requires 'SignedHeaders' parameter. Authorization header requires existence of either a 'X-Amz-Date' or a 'Date' header. Authorization=Basic YmVhdHNfc3lzdGVtOg=="}

DEBUG [elasticsearch] elasticsearch/client.go:730 GET https://log-central.xxx.com:443/_xpack?filter_path=features.monitoring.enabled <nil>

We tested bulk upload using curl and it works fine, but not with Filebeat. Any suggestions?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.