Upgrading to Elasticsearch 7.1.x: unable to get the cluster up

I have an existing Elasticsearch 5.5.2 cluster that I am trying to upgrade to 7.1.x. SSL was not enabled on the old cluster, as it was a non-prod environment.
I have changed elasticsearch.yml to the following configuration, since without it I was not able to create users and roles using curl. My configuration is now as follows:
```
xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: none
xpack.security.transport.ssl.key: <path to key.pem>
xpack.security.transport.ssl.certificate: <path to crt>
xpack.security.transport.ssl.certificate_authorities: <path to ca.pem>
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.verification_mode: none
xpack.security.http.ssl.key: <path to key.pem>
xpack.security.http.ssl.certificate: <path to crt>
xpack.security.http.ssl.certificate_authorities: <path to ca.pem>
```
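For reference, this is how I set the built-in user passwords after enabling security (a sketch; the install path is an assumption based on a default package install):

```shell
# Sketch: set built-in user passwords (elastic, kibana, etc.) interactively
# on one node once it is up. Assumes a default install location.
cd /usr/share/elasticsearch
bin/elasticsearch-setup-passwords interactive
```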

However, while checking the cluster health I see the following in the logs:

"o.e.x.m.e.l.LocalExporter", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "waiting for elected master node [{elastic-node-02.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net}{trZDZL-lQQ21eeRs8s3_QA}{Vl4qESV7Rv2p4MUnhAAeng}{172.18.0.12}{172.18.0.12:9300}{ml.machine_memory=128671608832, ml.max_open_jobs=20, xpack.installed=true, zone=zone1}] to setup local exporter [default_local] (does it have x-pack installed?)" }
"o.e.x.s.a.TokenService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "refresh keys" }
"o.e.x.s.a.AuthenticationService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "Authentication of [elastic] was terminated by realm [reserved] - failed to authenticate user [elastic]" }
"o.e.x.s.a.AuthenticationService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "Authentication of [elastic] was terminated by realm [reserved] - failed to authenticate user [elastic]" }
"o.e.x.s.a.AuthenticationService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "Authentication of [elastic] was terminated by realm [reserved] - failed to authenticate user [elastic]" }
"o.e.x.s.a.TokenService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "refreshed keys" }
"o.e.x.m.e.l.LocalExporter", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "waiting for elected master node [{elastic-node-02.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net}{trZDZL-lQQ21eeRs8s3_QA}{Vl4qESV7Rv2p4MUnhAAeng}{172.18.0.12}{172.18.0.12:9300}{ml.machine_memory=128671608832, ml.max_open_jobs=20, xpack.installed=true, zone=zone1}] to setup local exporter [default_local] (does it have x-pack installed?)" }
"o.e.x.m.e.l.LocalExporter", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "waiting for elected master node [{elastic-node-02.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net}{trZDZL-lQQ21eeRs8s3_QA}{Vl4qESV7Rv2p4MUnhAAeng}{172.18.0.12}{172.18.0.12:9300}{ml.machine_memory=128671608832, ml.max_open_jobs=20, xpack.installed=true, zone=zone1}] to setup local exporter [default_local] (does it have x-pack installed?)" }
"o.e.x.s.a.AuthenticationService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "Authentication of [elastic] was terminated by realm [reserved] - failed to authenticate user [elastic]" }
"o.e.x.s.a.AuthenticationService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "Authentication of [elastic] was terminated by realm [reserved] - failed to authenticate user [elastic]" }
"o.e.x.s.a.AuthenticationService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "Authentication of [elastic] was terminated by realm [reserved] - failed to authenticate user [elastic]" }
"o.e.x.s.a.AuthenticationService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "Authentication of [elastic] was terminated by realm [reserved] - failed to authenticate user [elastic]" }
"o.e.c.s.ClusterApplierService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "removed {{elastic-node-03.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net}{U5UYED0dS5iR_4W9UlN8nw}{bicv-mNCQJqnW79wX2pkiQ}{172.18.0.11}{172.18.0.11:9300}{ml.machine_memory=128671608832, ml.max_open_jobs=20, xpack.installed=true, zone=zone1},}, term: 4, version: 18, reason: ApplyCommitRequest{term=4, version=18, sourceNode={elastic-node-02.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net}{trZDZL-lQQ21eeRs8s3_QA}{Vl4qESV7Rv2p4MUnhAAeng}{172.18.0.12}{172.18.0.12:9300}{ml.machine_memory=128671608832, ml.max_open_jobs=20, xpack.installed=true, zone=zone1}}" }
"o.e.c.s.ClusterApplierService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "master node changed {previous [{elastic-node-02.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net}{trZDZL-lQQ21eeRs8s3_QA}{Vl4qESV7Rv2p4MUnhAAeng}{172.18.0.12}{172.18.0.12:9300}{ml.machine_memory=128671608832, ml.max_open_jobs=20, xpack.installed=true, zone=zone1}], current }, term: 4, version: 18, reason: becoming candidate: onLeaderFailure" }
"o.e.c.NodeConnectionsService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "failed to connect to node {elastic-node-02.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net}{trZDZL-lQQ21eeRs8s3_QA}{Vl4qESV7Rv2p4MUnhAAeng}{172.18.0.12}{172.18.0.12:9300}{ml.machine_memory=128671608832, ml.max_open_jobs=20, xpack.installed=true, zone=zone1} (tried [1] times)" ,
,"stacktrace": ["org.elasticsearch.transport.ConnectTransportException: [elastic-node-02.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net][172.18.0.12:9300] connect_exception",
,"at org.elasticsearch.transport.TcpTransport$ChannelsConnectedListener.onFailure(TcpTransport.java:1299) ~[elasticsearch-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.action.ActionListener.lambda$toBiConsumer$2(ActionListener.java:99) ~[elasticsearch-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.common.concurrent.CompletableContext.lambda$addListener$0(CompletableContext.java:42) ~[elasticsearch-core-7.1.1.jar:7.1.1]",
,"at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859) ~[?:?]",
,"at java.util.concurrent.CompletableFuture.uniWhenCompleteStage(CompletableFuture.java:883) ~[?:?]",
,"at java.util.concurrent.CompletableFuture.whenComplete(CompletableFuture.java:2322) ~[?:?]",
,"at org.elasticsearch.common.concurrent.CompletableContext.addListener(CompletableContext.java:45) ~[elasticsearch-core-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.transport.netty4.Netty4TcpChannel.addConnectListener(Netty4TcpChannel.java:100) ~[?:?]",
,"at org.elasticsearch.transport.TcpTransport.initiateConnection(TcpTransport.java:325) ~[elasticsearch-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.transport.TcpTransport.openConnection(TcpTransport.java:292) ~[elasticsearch-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.transport.ConnectionManager.internalOpenConnection(ConnectionManager.java:206) ~[elasticsearch-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.transport.ConnectionManager.connectToNode(ConnectionManager.java:104) ~[elasticsearch-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.transport.TransportService.connectToNode(TransportService.java:344) ~[elasticsearch-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.transport.TransportService.connectToNode(TransportService.java:331) ~[elasticsearch-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.cluster.NodeConnectionsService.validateAndConnectIfNeeded(NodeConnectionsService.java:153) [elasticsearch-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.cluster.NodeConnectionsService$1.doRun(NodeConnectionsService.java:106) [elasticsearch-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:751) [elasticsearch-7.1.1.jar:7.1.1]",
,"at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.1.1.jar:7.1.1]",
,"at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]",
,"at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]",
,"at java.lang.Thread.run(Thread.java:835) [?:?]",
,"Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: 172.18.0.12/172.18.0.12:9300",
,"at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?]",
,"at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:779) ~[?:?]",
,"at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:327) ~[?:?]",
,"at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340) ~[?:?]",
,"at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:644) ~[?:?]",
,"at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:556) ~[?:?]",
,"at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:510) ~[?:?]",
,"at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:470) ~[?:?]",
,"at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909) ~[?:?]",
,"... 1 more",
,"Caused by: java.net.ConnectException: Connection refused",
,"at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?]",
,"at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:779) ~[?:?]",
,"at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:327) ~[?:?]",
,"at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340) ~[?:?]",
,"at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:644) ~[?:?]",
,"at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:556) ~[?:?]",
,"at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:510) ~[?:?]",
,"at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:470) ~[?:?]",
,"at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909) ~[?:?]",
,"... 1 more"] }
,{"type": "server", "timestamp": "2020-07-03T13:58:08,537+1000", "level": "WARN", "component": "o.e.t.OutboundHandler", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "send message failed [channel: Netty4TcpChannel{localAddress=0.0.0.0/0.0.0.0:50984, remoteAddress=null}]" ,
,"stacktrace": ["java.nio.channels.ClosedChannelException: null",
,"at io.netty.handler.ssl.SslHandler.channelInactive(...)(Unknown Source) ~[?:?]"] }
,{"type": "server", "timestamp": "2020-07-03T13:58:08,543+1000", "level": "INFO", "component": "o.e.x.s.a.AuthenticationService", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "Authentication of [elastic] was terminated by realm [reserved] - failed to authenticate user [elastic]" }

Hello Ranjit,

Welcome to the forum. Please format logs as code for better readability, like this:
```
"o.e.x.m.e.l.LocalExporter", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "waiting for elected master node [{elastic-node-02.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net}{trZDZL-lQQ21eeRs8s3_QA}{Vl4qESV7Rv2p4MUnhAAeng}{172.18.0.12}{172.18.0.12:9300}{ml.machine_memory=128671608832, ml.max_open_jobs=20, xpack.installed=true, zone=zone1}] to setup local exporter [default_local] (does it have x-pack installed?)" }
```

Which then looks like:

"o.e.x.m.e.l.LocalExporter", "cluster.name": "elastic_cluster", "node.name": "elastic-node-01.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net", "cluster.uuid": "iOqnm-xHRbCYX_aUmhUjAA", "node.id": "oVR3E4hTR06YutJiKX7j8w", "message": "waiting for elected master node [{elastic-node-02.aws-jt-tt-30.abcd-wmd.syd.aws.onabcd.net}{trZDZL-lQQ21eeRs8s3_QA}{Vl4qESV7Rv2p4MUnhAAeng}{172.18.0.12}{172.18.0.12:9300}{ml.machine_memory=128671608832, ml.max_open_jobs=20, xpack.installed=true, zone=zone1}] to setup local exporter [default_local] (does it have x-pack installed?)" }

Please have a look here: https://www.elastic.co/guide/en/elasticsearch/reference/current/setup-upgrade.html

A version upgrade from 5.5 to 7.1 is a big jump, so it is recommended to upgrade to 5.6 first and then to 6.8. Version 6.8 includes the Upgrade Assistant in Kibana, which tells you which parts have to be fixed before upgrading to 7.x.
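As an illustration, once you are on 6.8 you can also query the deprecation API directly (a sketch; it assumes security is enabled, the elastic password is known, and the node listens on localhost:9200):

```shell
# Sketch: list deprecation warnings on a 6.8 cluster before moving to 7.x.
# -u prompts for the elastic user's password; -k skips cert verification
# (acceptable here only because verification_mode is already "none").
curl -u elastic -k "https://localhost:9200/_xpack/migration/deprecations?pretty"
```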

Best regards
Wolfram

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.