Kibana: 405 response code on refreshing index pattern’s field list

Hi, all.

This is an issue I've seen posted previously in this forum but with no actual answer.

When trying to refresh the field list of an index pattern, I get a 405 error response.
This is the error message:

    Error: 405 Response
    value/<@https://www.logiweb.de/kibana-TSI/bundles/commons.bundle.js?v=16627:21:102329
    processQueue@https://www.logiweb.de/kibana-TSI/bundles/vendors.bundle.js?v=16627:116:132456
    scheduleProcessQueue/<@https://www.logiweb.de/kibana-TSI/bundles/vendors.bundle.js?v=16627:116:133361
    $digest@https://www.logiweb.de/kibana-TSI/bundles/vendors.bundle.js?v=16627:116:144239
    $apply@https://www.logiweb.de/kibana-TSI/bundles/vendors.bundle.js?v=16627:116:147018
    done@https://www.logiweb.de/kibana-TSI/bundles/vendors.bundle.js?v=16627:116:100026
    completeRequest@https://www.logiweb.de/kibana-TSI/bundles/vendors.bundle.js?v=16627:116:104697
    createHttpBackend/</xhr.onload@https://www.logiweb.de/kibana-TSI/bundles/vendors.bundle.js?v=16627:116:105435

The Kibana version is 6.2.4.
I would really appreciate any suggestions. Thank you.

Hello @MRC1 - do you see errors in the Kibana and Elasticsearch logs as well when this happens?

Does this problem occur reliably? On all index patterns?

Hi @mattkime. Thanks for responding.
No, the errors appear only in Kibana, and yes, it happens with all index patterns.

Do you see any errors in the Kibana server logs?

Hi @mattkime, I've found this today in the elasticsearch log:

[2020-07-03T03:20:14,861][DEBUG][o.e.a.s.TransportSearchAction] [node-1] [other-prod-2020.07.03][0], node[ZRr7d6phQASV2sp7pCDpPw], [P], s[STARTED], a[id=YCIUgEskTRyGfE7jNBUWcw]: Failed to execute [SearchRequest{searchType=QUERY_THEN_FETCH, indices=[other-*], indicesOptions=IndicesOptions[id=39, ignore_unavailable=true, allow_no_indices=true, expand_wildcards_open=true, expand_wildcards_closed=false, allow_aliases_to_multiple_indices=true, forbid_closed_indices=true, ignore_aliases=false], types=[], routing='null', preference='1593738079026', requestCache=null, scroll=null, maxConcurrentShardRequests=5, batchedReduceSize=512, preFilterShardSize=6, source={"size":0,"query":{"bool":{"must":[{"match_all":{"boost":1.0}},{"query_string":{"query":"logType.keyword: mtfsconnector AND serverType.keyword: mtfs","default_field":"*","fields":[],"type":"best_fields","default_operator":"or","max_determinized_states":10000,"enable_position_increments":true,"fuzziness":"AUTO","fuzzy_prefix_length":0,"fuzzy_max_expansions":50,"phrase_slop":0,"analyze_wildcard":true,"escape":false,"auto_generate_synonyms_phrase_query":true,"fuzzy_transpositions":true,"boost":1.0}},{"query_string":{"query":"*","default_field":"*","fields":[],"type":"best_fields","default_operator":"or","max_determinized_states":10000,"enable_position_increments":true,"fuzziness":"AUTO","fuzzy_prefix_length":0,"fuzzy_max_expansions":50,"phrase_slop":0,"analyze_wildcard":true,"escape":false,"auto_generate_synonyms_phrase_query":true,"fuzzy_transpositions":true,"boost":1.0}},{"match_phrase":{"stage":{"query":"Production","slop":0,"boost":1.0}}},{"range":{"@timestamp":{"from":1593732009017,"to":1593739199999,"include_lower":true,"include_upper":true,"format":"epoch_millis","boost":1.0}}}],"adjust_pure_negative":true,"boost":1.0}},"version":true,"_source":{"includes":[],"excludes":[]},"stored_fields":"*","docvalue_fields":["@timestamp","logstash.processing.filterEnd","logstash.processing.filterStart","mtfs.time.parsedInConn
ector","mtfs.time.receivedByGwy","mtfs.time.retrievedFromActiveMQ","timestamp_status_auth_start","timestamp_status_created","timestamp_status_dl_start","timestamp_status_end","tkmMsgTimeSend"],"script_fields":{},"aggregations":{"2":{"date_histogram":{"field":"@timestamp","time_zone":"Europe/Berlin","interval":"1m","offset":0,"order":{"_key":"asc"},"keyed":false,"min_doc_count":1},"aggregations":{"3":{"filters":{"filters":{"ERROR":{"query_string":{"query":"loglevel.keyword: ERROR","default_field":"*","fields":[],"type":"best_fields","default_operator":"or","max_determinized_states":10000,"enable_position_increments":true,"fuzziness":"AUTO","fuzzy_prefix_length":0,"fuzzy_max_expansions":50,"phrase_slop":0,"analyze_wildcard":true,"escape":false,"auto_generate_synonyms_phrase_query":true,"fuzzy_transpositions":true,"boost":1.0}},"INFO":{"query_string":{"query":"loglevel.keyword: INFO","default_field":"*","fields":[],"type":"best_fields","default_operator":"or","max_determinized_states":10000,"enable_position_increments":true,"fuzziness":"AUTO","fuzzy_prefix_length":0,"fuzzy_max_expansions":50,"phrase_slop":0,"analyze_wildcard":true,"escape":false,"auto_generate_synonyms_phrase_query":true,"fuzzy_transpositions":true,"boost":1.0}},"WARN":{"query_string":{"query":"loglevel.keyword: WARN","default_field":"*","fields":[],"type":"best_fields","default_operator":"or","max_determinized_states":10000,"enable_position_increments":true,"fuzziness":"AUTO","fuzzy_prefix_length":0,"fuzzy_max_expansions":50,"phrase_slop":0,"analyze_wildcard":true,"escape":false,"auto_generate_synonyms_phrase_query":true,"fuzzy_transpositions":true,"boost":1.0}}},"other_bucket":false,"other_bucket_key":"_other_"}}}}}}}] lastShard [true]
org.elasticsearch.transport.RemoteTransportException: [node-1][139.1.117.41:19300][indices:data/read/search[phase/query]]
Caused by: org.elasticsearch.common.util.concurrent.EsRejectedExecutionException: rejected execution of org.elasticsearch.common.util.concurrent.TimedRunnable@52b4f9dd on QueueResizingEsThreadPoolExecutor[name = node-1/search, queue capacity = 1000, min queue capacity = 1000, max queue capacity = 1000, frame size = 2000, targeted response rate = 1s, task execution EWMA = 502nanos, adjustment amount = 50, org.elasticsearch.common.util.concurrent.QueueResizingEsThreadPoolExecutor@65f40481[Running, pool size = 13, active threads = 13, queued tasks = 1189, completed tasks = 1211465406]]
	at org.elasticsearch.common.util.concurrent.EsAbortPolicy.rejectedExecution(EsAbortPolicy.java:48) ~[elasticsearch-6.2.3.jar:6.2.3]
	at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830) ~[?:1.8.0_151]
	at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379) ~[?:1.8.0_151]
	at org.elasticsearch.common.util.concurrent.EsThreadPoolExecutor.doExecute(EsThreadPoolExecutor.java:98) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.common.util.concurrent.QueueResizingEsThreadPoolExecutor.doExecute(QueueResizingEsThreadPoolExecutor.java:88) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.common.util.concurrent.EsThreadPoolExecutor.execute(EsThreadPoolExecutor.java:93) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.search.SearchService.lambda$rewriteShardRequest$0(SearchService.java:994) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:60) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.index.query.Rewriteable.rewriteAndFetch(Rewriteable.java:113) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.index.query.Rewriteable.rewriteAndFetch(Rewriteable.java:86) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.search.SearchService.rewriteShardRequest(SearchService.java:992) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:312) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.SearchTransportService$6.messageReceived(SearchTransportService.java:372) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.SearchTransportService$6.messageReceived(SearchTransportService.java:369) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:66) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.transport.TransportService.sendLocalRequest(TransportService.java:650) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.transport.TransportService.access$000(TransportService.java:77) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.transport.TransportService$3.sendRequest(TransportService.java:138) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.transport.TransportService.sendRequestInternal(TransportService.java:598) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.transport.TransportService.sendRequest(TransportService.java:518) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.transport.TransportService.sendChildRequest(TransportService.java:558) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.transport.TransportService.sendChildRequest(TransportService.java:549) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.SearchTransportService.sendExecuteQuery(SearchTransportService.java:152) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.SearchQueryThenFetchAsyncAction.executePhaseOnShard(SearchQueryThenFetchAsyncAction.java:52) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.InitialSearchPhase.performPhaseOnShard(InitialSearchPhase.java:208) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.InitialSearchPhase.run(InitialSearchPhase.java:153) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.AbstractSearchAsyncAction.executePhase(AbstractSearchAsyncAction.java:146) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.AbstractSearchAsyncAction.start(AbstractSearchAsyncAction.java:116) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.TransportSearchAction$1.run(TransportSearchAction.java:393) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.AbstractSearchAsyncAction.executePhase(AbstractSearchAsyncAction.java:146) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.AbstractSearchAsyncAction.executeNextPhase(AbstractSearchAsyncAction.java:140) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.AbstractSearchAsyncAction.onPhaseDone(AbstractSearchAsyncAction.java:243) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.InitialSearchPhase.successfulShardExecution(InitialSearchPhase.java:251) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.InitialSearchPhase.onShardResult(InitialSearchPhase.java:239) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.InitialSearchPhase.access$200(InitialSearchPhase.java:49) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.InitialSearchPhase$2.lambda$innerOnResponse$0(InitialSearchPhase.java:212) ~[elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.action.search.InitialSearchPhase$1.doRun(InitialSearchPhase.java:184) [elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:672) [elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.common.util.concurrent.TimedRunnable.doRun(TimedRunnable.java:41) [elasticsearch-6.2.3.jar:6.2.3]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.2.3.jar:6.2.3]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_151]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_151]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_151]

Again, thanks for your help. :pray:
MRC1

@MRC1 That log appears to indicate a search request is failing. Do you know where that request is coming from? It doesn't appear to be directly associated with the Kibana error you're seeing, but if it's causing performance problems with the cluster, it might result in Kibana errors as well.
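For what it's worth, the `EsRejectedExecutionException` in that log (`queued tasks = 1189` against a `queue capacity = 1000`) means the `search` thread pool queue was full at that moment, so the node was shedding search load. One way to watch for this is the `_cat/thread_pool` API, which exists in 6.x. A minimal sketch, assuming a cluster reachable at `localhost:9200` - adjust the host, port, and any auth for your setup:

```shell
# Show active threads, queue depth, and rejection counts for the search pool.
# The host:port here is a placeholder -- point it at your Elasticsearch node.
curl -s 'http://localhost:9200/_cat/thread_pool/search?v&h=node_name,name,active,queue,rejected,completed'
```

If the `rejected` column keeps climbing while you reproduce the Kibana error, that would point at cluster load rather than Kibana itself.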

Thinking back, I believe we were working on some big reports at that time, which makes me think the cluster was just overwhelmed and that's why there were a few failed search requests in the log.

Usually we don't have those performance issues.

Fortunately I kept that day's log file, and there were only a couple of failed search request entries and nothing else noteworthy.

I tried again today, triggering the error in Kibana and then checking the Elasticsearch log for any errors at that time. No luck.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.