Hello,
I have 5 nodes running RHEL 8.7. The system is hardened, so the partitions /var, /var/tmp, /var/log, /home, /var/log/audit and /tmp are mounted with noexec.
I managed to work around the problem on the first two nodes by adding this option in /etc/sysconfig/elasticsearch:
ES_JAVA_OPTS="-Djava.io.tmpdir=/etc/elasticsearch/tmp"
with a tmp directory owned by root:elasticsearch.
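Concretely, this is roughly what I did on those first two nodes (the /etc/elasticsearch/tmp path and the elasticsearch group match my install, and the 770 mode is just what I picked; adjust to your policy):
mkdir -p /etc/elasticsearch/tmp
chown root:elasticsearch /etc/elasticsearch/tmp
chmod 770 /etc/elasticsearch/tmp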
I applied the same configuration on the other 3 nodes, but Elasticsearch does not start.
I get this error:
This node does not have access to the tmp directory you have configured. You'll need to pick a directory to which it has access and/or adjust the permissions on this directory to give it access.
On a hardened system it's usually more complicated than that.
Elasticsearch is not doing anything magical here: it's just trying to create a file at the given path, and the OS is returning the error code EACCES, indicating that it doesn't have permission to do so.
This depends entirely on the details of your system config, in particular how the hardening has been set up. As I said, there's no magic here: Elasticsearch asks to create a file via the open() syscall and the OS responds with an error. There's nothing Elasticsearch-specific about this, and your local sysadmin folk will likely be able to give you much more useful advice than we can.
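A quick way to see what the OS itself says, assuming the service runs as the elasticsearch user and uses the path above (adjust both as needed):
sudo -u elasticsearch touch /etc/elasticsearch/tmp/permcheck && echo "write OK"
ls -ldZ /etc/elasticsearch/tmp
The first line tries to create a file exactly as the service would; the second shows the ownership, mode and SELinux context of the directory, which on a hardened RHEL box is often the missing piece.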
I managed to correct the previous error.
I now have the last two nodes showing this error:
java.lang.ExceptionInInitializerError: null
at org.elasticsearch.bootstrap.JNANatives.definitelyRunningAsRoot(JNANatives.java:162) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.bootstrap.Natives.definitelyRunningAsRoot(Natives.java:57) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.bootstrap.Elasticsearch.initializeNatives(Elasticsearch.java:259) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.bootstrap.Elasticsearch.initPhase2(Elasticsearch.java:166) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:66) ~[elasticsearch-8.5.2.jar:?]
Caused by: java.lang.UnsupportedOperationException: Failed to allocate closure
at com.sun.jna.Native.registerMethod(Native Method) ~[jna-5.10.0.jar:?]
at com.sun.jna.Native.register(Native.java:1906) ~[jna-5.10.0.jar:?]
at com.sun.jna.Native.register(Native.java:1775) ~[jna-5.10.0.jar:?]
at com.sun.jna.Native.register(Native.java:1493) ~[jna-5.10.0.jar:?]
at org.elasticsearch.bootstrap.JNACLibrary.<clinit>(JNACLibrary.java:38) ~[elasticsearch-8.5.2.jar:?]
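In case it helps someone else: my (non-authoritative) understanding is that JNA has to extract a native library into the JVM temp directory and map executable memory, so a noexec tmpdir or an SELinux execmem denial can produce this "Failed to allocate closure". Two quick checks on the tmpdir configured above:
findmnt -T /etc/elasticsearch/tmp -o TARGET,OPTIONS
getenforce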
I have one last error at cluster startup. Everything communicates correctly, but I get this error on one node:
[2023-01-04T17:33:49,537][ERROR][o.e.i.g.GeoIpDownloader ] [node-2] exception during geoip databases update
java.net.UnknownHostException: geoip.elastic.co
at sun.nio.ch.NioSocketImpl.connect(NioSocketImpl.java:560) ~[?:?]
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:327) ~[?:?]
at java.net.Socket.connect(Socket.java:666) ~[?:?]
at sun.security.ssl.SSLSocketImpl.connect(SSLSocketImpl.java:304) ~[?:?]
at sun.net.NetworkClient.doConnect(NetworkClient.java:178) ~[?:?]
at sun.net.www.http.HttpClient.openServer(HttpClient.java:531) ~[?:?]
at sun.net.www.http.HttpClient.openServer(HttpClient.java:636) ~[?:?]
at sun.net.www.protocol.https.HttpsClient.<init>(HttpsClient.java:264) ~[?:?]
at sun.net.www.protocol.https.HttpsClient.New(HttpsClient.java:378) ~[?:?]
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.getNewHttpClient(AbstractDelegateHttpsURLConnection.java:193) ~[?:?]
at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1241) ~[?:?]
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1127) ~[?:?]
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:179) ~[?:?]
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1661) ~[?:?]
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1585) ~[?:?]
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:529) ~[?:?]
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getResponseCode(HttpsURLConnectionImpl.java:308) ~[?:?]
at org.elasticsearch.ingest.geoip.HttpClient.lambda$get$0(HttpClient.java:46) ~[?:?]
at java.security.AccessController.doPrivileged(AccessController.java:569) ~[?:?]
at org.elasticsearch.ingest.geoip.HttpClient.doPrivileged(HttpClient.java:88) ~[?:?]
at org.elasticsearch.ingest.geoip.HttpClient.get(HttpClient.java:40) ~[?:?]
at org.elasticsearch.ingest.geoip.HttpClient.getBytes(HttpClient.java:36) ~[?:?]
at org.elasticsearch.ingest.geoip.GeoIpDownloader.fetchDatabasesOverview(GeoIpDownloader.java:155) ~[?:?]
at org.elasticsearch.ingest.geoip.GeoIpDownloader.updateDatabases(GeoIpDownloader.java:143) ~[?:?]
at org.elasticsearch.ingest.geoip.GeoIpDownloader.runDownloader(GeoIpDownloader.java:274) ~[?:?]
at org.elasticsearch.ingest.geoip.GeoIpDownloaderTaskExecutor.nodeOperation(GeoIpDownloaderTaskExecutor.java:102) ~[?:?]
at org.elasticsearch.ingest.geoip.GeoIpDownloaderTaskExecutor.nodeOperation(GeoIpDownloaderTaskExecutor.java:48) ~[?:?]
at org.elasticsearch.persistent.NodePersistentTasksExecutor$1.doRun(NodePersistentTasksExecutor.java:42) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:892) ~[elasticsearch-8.5.2.jar:?]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:26) ~[elasticsearch-8.5.2.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
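I assume these nodes simply have no route to geoip.elastic.co (the hostname doesn't even resolve). If the GeoIP databases aren't needed, my understanding from the docs is that the downloader can be disabled in elasticsearch.yml on each node:
ingest.geoip.downloader.enabled: false
Otherwise the nodes need to be able to resolve and reach geoip.elastic.co over HTTPS.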
To authenticate Kibana with my cluster, I use a service token that I created like this: bin/elasticsearch-service-tokens create elastic/kibana kibana-to-es-token
Then I copied the token file to all the nodes of the cluster and set it in Kibana:
elasticsearch.serviceAccountToken:xxxxxx
but I get this error message in Kibana: [security_exception] Reason: failed to authenticate service account [elastic/kibana] with token name [kibana-to-es-token]
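For reference, here is my understanding of how the file-based token flow is supposed to work (the token value is a placeholder and the paths assume an RPM install):
bin/elasticsearch-service-tokens create elastic/kibana kibana-to-es-token
The command prints the bearer token once, something like SERVICE_TOKEN elastic/kibana/kibana-to-es-token = <token-value>, and stores its hash in the service_tokens file in the config directory (/etc/elasticsearch/service_tokens here). That file is what has to be copied to every node, keeping it owned root:elasticsearch so the service can read it. The printed value, not the file, is what goes into kibana.yml, with a space after the colon:
elasticsearch.serviceAccountToken: "<token-value>"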