Elasticsearch-hadoop Connector

I want to connect to an Elasticsearch [7.5] cluster with elasticsearch-hadoop using the SSL protocol. By providing the user and password with curl:
curl -XPUT -k -u "user:pass" https://xxxxxxxxxxx:9200/prefix-test
curl -XGET -k -u "user:pass" https://xxxxxxxxxxx.com:9200/prefix-test
I am able to create an index and get a response. Here "user" can only read from or write to indices prefixed with "prefix".

But when I try to connect to this cluster using elasticsearch-hadoop [7.5] with Spark [2.3.1] in Scala, with the following configuration:
"es.nodes": "xxxxxxxxxxx"
"es.port": "9200",
"es.net.http.auth.user": "user"
"es.net.http.auth.pass": "password"
"es.net.ssl": "true"
"es.index.auto.create": "yes"
"es.net.ssl.cert.allow.self.signed": "true"
"es.nodes.path.prefix": "/prefix"

Then I try to insert data with:

This gives me:
org.elasticsearch.hadoop.rest.EsHadoopInvalidRequest:org.elasticsearch.hadoop.rest.EsHadoopRemoteException: index_not_found_exception: no such index [prefix]

And without the parameter "es.nodes.path.prefix": "/prefix", I get:
org.elasticsearch.hadoop.rest.EsHadoopInvalidRequest:org.elasticsearch.hadoop.rest.EsHadoopRemoteException: security_exception: action [cluster:monitor/main] is unauthorized for user [user]

What am I missing here? How do I use this prefix parameter?

Could be relevant: Spark job is failing with authenticating with BASIC error
