Now everything except Console is working properly!
Here is the ES cluster log for a Console request:
internalDispatchRequest path:/gravity.process.201701/_search, method:GET, headers:{"content-length":"0","x-forwarded-proto":"http","Connection":"close","x-forwarded-port":"56110","Host":"x.x.x.x","x-forwarded-for":"x.x.x.x"}, params:{}, content:, credentials:,
The headers of the request that the ES node receives don't contain any Authorization info.
When other requests such as '/.kibana/config/_search' execute (requests from Kibana itself, not Console), it logs this:
internalDispatchRequest path:/.kibana/config/_search, method:POST, headers:{"Authorization":"Basic [REDACTED]","Connection":"keep-alive","content-type":"application/json","Host":"x.x.x.x:8080","Content-Length":"77"}, params:{}, content:{"size":1000,"sort":[{"buildNum":{"order":"desc","unmapped_type":"string"}}]}, credentials:,
Here, however, the headers do contain Authorization info.
You shouldn't need to specify both a username and password AND an Authorization header. Was setting elasticsearch.username/elasticsearch.password not working for you?
Also, note that the header you posted is simply base64 encoded, which effectively means it's just plaintext. As such, it's pretty easy to get your real username and password out of there by simply base64 decoding it.
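For illustration (with made-up credentials, not the redacted ones above), recovering the plaintext from such a header is a one-liner:

```python
import base64

# Made-up header for illustration; base64 of "elastic:changeme".
header = "Basic ZWxhc3RpYzpjaGFuZ2VtZQ=="

# Drop the "Basic " prefix and base64-decode to recover user:password.
decoded = base64.b64decode(header.split(" ", 1)[1]).decode()
print(decoded)  # elastic:changeme
```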
Yes, but if I only specify the username and password, Kibana can start up; however, I can't get any data from ES until I configure the customHeaders.
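For reference, the relevant kibana.yml settings look roughly like this (placeholder credentials; `elasticsearch.customHeaders` is the workaround setting I mean):

```yaml
# kibana.yml -- sketch with placeholder credentials.

# This pair alone should normally be enough; Kibana is supposed to
# build the Authorization header from it automatically.
elasticsearch.username: "kibana"
elasticsearch.password: "changeme"

# The workaround: send the header explicitly on every request
# Kibana makes to Elasticsearch ("Basic " + base64("kibana:changeme")).
elasticsearch.customHeaders:
  Authorization: "Basic a2liYW5hOmNoYW5nZW1l"
```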
I meant that what you posted was not a secret; anyone could have come along and grabbed your info from it. That's why an admin updated the post as well.
So Kibana and Elasticsearch don't work simply by specifying elasticsearch.username/password? How are you putting basic auth on the ES side? Using X-Pack, or something you rolled on your own?
If you are using X-Pack, things will just work. If you have rolled your own, maybe something isn't configured correctly. The username and password should be used to create and send the Authorization automatically.
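To make that last point concrete, here is a minimal sketch (placeholder credentials) of what building that Authorization header from the username and password amounts to, per RFC 7617:

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    # RFC 7617 Basic auth: "Basic " + base64("username:password")
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

# Placeholder credentials for illustration.
print(basic_auth_header("kibana", "changeme"))  # Basic a2liYW5hOmNoYW5nZW1l
```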