RemoteTransportException cannot be cast to class org.elasticsearch.index.IndexNotFoundException after upgrade to 7.11.2 from 7.6.0

Hello,

We recently upgraded our Elasticsearch stack from 7.6.0 to 7.11.2 and are seeing an issue when attempting PUT requests against indices that don't exist:

"message": "failed to handle exception response [org.elasticsearch.transport.TransportService$ContextRestoreResponseHandler/org.elasticsearch.transport.TransportService$6/[indices:admin/auto_create]:org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$1@5cbcbb03/org.elasticsearch.action.support.TransportAction$1@63c6e8ba]", "cluster.uuid": "WZjUh9hSRyK6IkZwNMF8rQ", "node.id": "jn7PgJoRRdCE3WbC4x2J8A"
"stacktrace": ["java.lang.ClassCastException: class org.elasticsearch.transport.RemoteTransportException cannot be cast to class org.elasticsearch.index.IndexNotFoundException (org.elasticsearch.transport.RemoteTransportException and org.elasticsearch.index.IndexNotFoundException are in unnamed module of loader 'app')",
"at org.elasticsearch.action.bulk.TransportBulkAction$1.onFailure(TransportBulkAction.java:263) ~[elasticsearch-7.11.2.jar:7.11.2]",
at "at org.elasticsearch.action.support.TransportAction$1.onFailure(TransportAction.java:92) ~[elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$1.handleException(TransportMasterNodeAction.java:177) ~[elasticsearch-7.11.2.jar:7.11.2]",

Potentially related to this PR? Add a template parameter to override auto_create_index value by pugnascotia · Pull Request #61858 · elastic/elasticsearch · GitHub

We currently use an elasticsearch.yml that defines auto-create-index rules disallowing these extraneous indices from being created:
elasticsearch.yml: |
action.auto_create_index: ".watches,.triggered_watches,.watcher-history*,.kibana*,.monitoring*"

Before we upgraded, a PUT to a missing index would simply return a 404 indicating the index wasn't found, and we could move on. With these ClassCastExceptions we no longer get that behavior.
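In other words, the client-side pattern that worked before the upgrade was to treat a 404 on the index operation as ignorable and everything else as a real failure. A minimal sketch of that handling (a hypothetical helper, not taken from any Elasticsearch client library):

```python
def handle_index_response(status_code: int) -> str:
    """Hypothetical sketch of the pre-upgrade client behavior
    described above. Not from any official Elasticsearch client."""
    if 200 <= status_code < 300:
        return "ok"
    if status_code == 404:
        # Index doesn't exist and auto-creation is disallowed:
        # log it and move on, as the poster describes.
        return "skip"
    # Anything else (e.g. a server error surfaced by the
    # exception-handling bug) is a genuine failure.
    raise RuntimeError(f"indexing failed with HTTP {status_code}")

print(handle_index_response(404))  # skip
```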

{"type": "server", "timestamp": "2021-03-23T16:50:15,684Z", "level": "ERROR", "component": "o.e.t.InboundHandler", "cluster.name": "ao-elasticsearch", "node.name": "ao-elasticsearch-master-7", "message": "failed to handle exception response [org.elasticsearch.transport.TransportService$ContextRestoreResponseHandler/org.elasticsearch.transport.TransportService$6/[indices:admin/auto_create]:org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$1@1c563eb5/org.elasticsearch.action.support.TransportAction$1@4b9bd095]", "cluster.uuid": "WZjUh9hSRyK6IkZwNMF8rQ", "node.id": "X2ohsBJ1Quyw7PuDwLhdjw" ,
"stacktrace": ["java.lang.ClassCastException: class org.elasticsearch.transport.RemoteTransportException cannot be cast to class org.elasticsearch.index.IndexNotFoundException (org.elasticsearch.transport.RemoteTransportException and org.elasticsearch.index.IndexNotFoundException are in unnamed module of loader 'app')",
"at org.elasticsearch.action.bulk.TransportBulkAction$1.onFailure(TransportBulkAction.java:263) ~[elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.action.support.TransportAction$1.onFailure(TransportAction.java:92) ~[elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.action.support.master.TransportMasterNodeAction$AsyncSingleAction$1.handleException(TransportMasterNodeAction.java:177) ~[elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.TransportService$6.handleException(TransportService.java:743) ~[elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.TransportService$ContextRestoreResponseHandler.handleException(TransportService.java:1288) ~[elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.InboundHandler.lambda$handleException$3(InboundHandler.java:266) [elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.common.util.concurrent.EsExecutors$DirectExecutorService.execute(EsExecutors.java:213) [elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.InboundHandler.handleException(InboundHandler.java:264) [elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.InboundHandler.handlerResponseError(InboundHandler.java:256) [elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.InboundHandler.messageReceived(InboundHandler.java:120) [elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.InboundHandler.inboundMessage(InboundHandler.java:78) [elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.TcpTransport.inboundMessage(TcpTransport.java:689) [elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.InboundPipeline.forwardFragments(InboundPipeline.java:131) [elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.InboundPipeline.doHandleBytes(InboundPipeline.java:106) [elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.InboundPipeline.handleBytes(InboundPipeline.java:71) [elasticsearch-7.11.2.jar:7.11.2]",
"at org.elasticsearch.transport.netty4.Netty4MessageChannelHandler.channelRead(Netty4MessageChannelHandler.java:63) [transport-netty4-client-7.11.2.jar:7.11.2]",
"at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.handler.logging.LoggingHandler.channelRead(LoggingHandler.java:271) [netty-handler-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:615) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:578) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493) [netty-transport-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) [netty-common-4.1.49.Final.jar:4.1.49.Final]",
"at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.49.Final.jar:4.1.49.Final]",
"at java.lang.Thread.run(Thread.java:832) [?:?]"] }

Hey @Brady_Davis,
I'm also facing this issue. Did you solve it?

We got around this by upgrading to 7.12. Perhaps it's a bug in 7.11.2? Try upgrading to a more recent version.

It was, and it's fixed by this PR: