Default elastic user with superuser role has no access to APM API

Elasticsearch and Kibana are both running v7.6.2 on EKS using Helm

When logged in as the default elastic user with the superuser role, I am getting a 500 Internal Server Error when making requests to the APM API in the APM tab. Everything else works fine in Kibana.

I see that the elastic user has the superuser role:

› kubectl exec -it elastic-stack-coordinator-0 -c elastic-stack -- bash
[elasticsearch@elastic-stack-coordinator-0 ~]$ curl -k -u elastic:elastic https://elastic-stack-coordinator:9200/_security/role/superuser
{"superuser":{"cluster":["all"],"indices":[{"names":["*"],"privileges":["all"],"allow_restricted_indices":true}],"applications":[{"application":"*","privileges":["*"],"resources":["*"]}],"run_as":["*"],"metadata":{"_reserved":true},"transient_metadata":{}}}
[elasticsearch@elastic-stack-coordinator-0 ~]$ curl -k -u elastic:elastic https://elastic-stack-coordinator:9200/_security/user/elastic       
{"elastic":{"username":"elastic","roles":["superuser"],"full_name":null,"email":null,"metadata":{"_reserved":true},"enabled":true}}

I also see that .apm-agent-configuration exists:

[elasticsearch@elastic-stack-coordinator-0 ~]$ curl -k -u elastic:elastic https://elastic-stack-coordinator:9200/_cat/indices/.*
green open .apm-agent-configuration        Y5C6uLMwSL6nnw_k4GjG1Q 1 1     0     0    566b    283b
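
To take the browser out of the picture (a sketch; the agent configuration endpoint path is an assumption for this Kibana version, and Kibana is assumed reachable at https://localhost:5601), the same APM settings call can be reproduced directly against the Kibana API:

curl -k -u elastic:elastic -H 'kbn-xsrf: true' https://localhost:5601/api/apm/settings/agent-configuration

If this also returns a 500, the problem is entirely server-side and not something in the UI.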

but when I go to Settings, I am met with an error and the dropdown is empty, so I cannot progress.

I am able to see the data from the APM Server in the Discover tab. This only happens in the test cluster; APM works fine in both the dev cluster (which has an identical configuration) and the prod cluster.
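
Since dev and test are supposedly identical, one quick way to narrow this down (a sketch against the same coordinator service; the apm* template name pattern is an assumption based on a default APM Server setup) is to list the APM index templates in each cluster and diff the bodies:

# run against both the test and the dev cluster and compare
curl -k -u elastic:elastic "https://elastic-stack-coordinator:9200/_cat/templates/apm*?v"
curl -k -u elastic:elastic "https://elastic-stack-coordinator:9200/_template/apm*?pretty"

A template that is missing or different in the test cluster would explain APM indices being created with the wrong mappings there.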

Just tried upgrading to v7.7.1 in the test EKS cluster, but I am seeing the same thing.

Hello @danksim, can you share the corresponding Kibana server logs?

Thanks @JLeysens

So this morning I upgraded to the latest available Elasticsearch Helm chart, version 7.17.3, and am now seeing a different error in the Kibana UI.

Here are some logs when I tail the Kibana pod:

{"type":"log","@timestamp":"2022-09-23T14:54:09+00:00","tags":["error","plugins","apm"],"pid":7,"message":"WrappedElasticsearchClientError: search_phase_execution_exception: [illegal_argument_exception] Reason: Text fields are not optimised for operations that require per-document field data like aggregations and sorting, so these operations are disabled by default. Please use a keyword field instead. Alternatively, set fielddata=true on [service.name] in order to load field data by uninverting the inverted index. Note that this can use significant memory.\n    at /usr/share/kibana/x-pack/plugins/observability/common/utils/unwrap_es_response.js:60:11\n    at runMicrotasks (<anonymous>)\n    at processTicksAndRejections (node:internal/process/task_queues:96:5)\n    at getServiceTransactionStats (/usr/share/kibana/x-pack/plugins/apm/server/lib/services/get_services/get_service_transaction_stats.js:52:20)\n    at async Promise.all (index 0)\n    at /usr/share/kibana/x-pack/plugins/apm/server/lib/services/get_services/get_services_items.js:46:77\n    at async Promise.all (index 0)\n    at /usr/share/kibana/x-pack/plugins/apm/server/lib/services/get_services/index.js:31:36\n    at /usr/share/kibana/x-pack/plugins/apm/server/routes/register_routes/index.js:144:13\n    at Router.handle (/usr/share/kibana/src/core/server/http/router/router.js:163:30)\n    at handler (/usr/share/kibana/src/core/server/http/router/router.js:124:50)\n    at exports.Manager.execute (/usr/share/kibana/node_modules/@hapi/hapi/lib/toolkit.js:60:28)\n    at Object.internals.handler (/usr/share/kibana/node_modules/@hapi/hapi/lib/handler.js:46:20)\n    at exports.execute (/usr/share/kibana/node_modules/@hapi/hapi/lib/handler.js:31:20)\n    at Request._lifecycle (/usr/share/kibana/node_modules/@hapi/hapi/lib/request.js:371:32)\n    at Request._execute (/usr/share/kibana/node_modules/@hapi/hapi/lib/request.js:281:9)\nResponse: {\n  error: {\n    root_cause: [\n      {\n        type: 'illegal_argument_exception',\n        reason: 'Text fields are not optimised for operations that require per-document field data like aggregations and sorting, so these operations are disabled by default. Please use a keyword field instead. Alternatively, set fielddata=true on [service.name] in order to load field data by uninverting the inverted index. Note that this can use significant memory.'\n      }\n    ],\n    type: 'search_phase_execution_exception',\n    reason: 'all shards failed',\n    phase: 'query',\n    grouped: true,\n    failed_shards: [\n      {\n        shard: 0,\n        index: 'apm-index-000001',\n        node: 's_UIMdClQbW_AMrZ_yLBOQ',\n        reason: {\n          type: 'illegal_argument_exception',\n          reason: 'Text fields are not optimised for operations that require per-document field data like aggregations and sorting, so these operations are disabled by default. Please use a keyword field instead. Alternatively, set fielddata=true on [service.name] in order to load field data by uninverting the inverted index. Note that this can use significant memory.'\n        }\n      }\n    ],\n    caused_by: {\n      type: 'illegal_argument_exception',\n      reason: 'Text fields are not optimised for operations that require per-document field data like aggregations and sorting, so these operations are disabled by default. Please use a keyword field instead. Alternatively, set fielddata=true on [service.name] in order to load field data by uninverting the inverted index. 
Note that this can use significant memory.',\n      caused_by: {\n        type: 'illegal_argument_exception',\n        reason: 'Text fields are not optimised for operations that require per-document field data like aggregations and sorting, so these operations are disabled by default. Please use a keyword field instead. Alternatively, set fielddata=true on [service.name] in order to load field data by uninverting the inverted index. Note that this can use significant memory.'\n      }\n    }\n  },\n  status: 400\n}\n {\n  originalError: ResponseError: search_phase_execution_exception: [illegal_argument_exception] Reason: Text fields are not optimised for operations that require per-document field data like aggregations and sorting, so these operations are disabled by default. Please use a keyword field instead. Alternatively, set fielddata=true on [service.name] in order to load field data by uninverting the inverted index. Note that this can use significant memory.\n      at onBody (/usr/share/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:367:23)\n      at IncomingMessage.onEnd (/usr/share/kibana/node_modules/@elastic/elasticsearch/lib/Transport.js:291:11)\n      at IncomingMessage.emit (node:events:538:35)\n      at endReadableNT (node:internal/streams/readable:1345:12)\n      at processTicksAndRejections (node:internal/process/task_queues:83:21) {\n    meta: {\n      body: [Object],\n      statusCode: 400,\n      headers: [Object],\n      meta: [Object]\n    }\n  }\n}"}
...
{"type":"error","@timestamp":"2022-09-23T14:54:09+00:00","tags":[],"pid":7,"level":"error","error":{"message":"Internal Server Error","name":"Error","stack":"Error: Internal Server Error\n    at HapiResponseAdapter.toError (/usr/share/kibana/src/core/server/http/router/response_adapter.js:128:19)\n    at HapiResponseAdapter.toHapiResponse (/usr/share/kibana/src/core/server/http/router/response_adapter.js:82:19)\n    at HapiResponseAdapter.handle (/usr/share/kibana/src/core/server/http/router/response_adapter.js:73:17)\n    at Router.handle (/usr/share/kibana/src/core/server/http/router/router.js:164:34)\n    at runMicrotasks (<anonymous>)\n    at processTicksAndRejections (node:internal/process/task_queues:96:5)\n    at handler (/usr/share/kibana/src/core/server/http/router/router.js:124:50)\n    at exports.Manager.execute (/usr/share/kibana/node_modules/@hapi/hapi/lib/toolkit.js:60:28)\n    at Object.internals.handler (/usr/share/kibana/node_modules/@hapi/hapi/lib/handler.js:46:20)\n    at exports.execute (/usr/share/kibana/node_modules/@hapi/hapi/lib/handler.js:31:20)\n    at Request._lifecycle (/usr/share/kibana/node_modules/@hapi/hapi/lib/request.js:371:32)\n    at Request._execute (/usr/share/kibana/node_modules/@hapi/hapi/lib/request.js:281:9)"},"url":"https://localhost:5601/internal/apm/services?environment=ENVIRONMENT_ALL&kuery=&start=2022-09-23T14%3A39%3A00.000Z&end=2022-09-23T14%3A54%3A08.616Z","message":"Internal Server Error"}
...
{"type":"response","@timestamp":"2022-09-23T14:54:09+00:00","tags":["access:apm"],"pid":7,"method":"get","statusCode":500,"req":{"url":"/internal/apm/services?environment=ENVIRONMENT_ALL&kuery=&start=2022-09-23T14%3A39%3A00.000Z&end=2022-09-23T14%3A54%3A08.616Z","method":"get","headers":{"host":"localhost:5601","user-agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:104.0) Gecko/20100101 Firefox/104.0","accept":"*/*","accept-language":"en-US,en;q=0.5","accept-encoding":"gzip, deflate, br","referer":"https://localhost:5601/app/apm/services?rangeFrom=now-15m&rangeTo=now&comparisonEnabled=true&comparisonType=day","content-type":"application/json","kbn-version":"7.17.3","connection":"keep-alive","sec-fetch-dest":"empty","sec-fetch-mode":"cors","sec-fetch-site":"same-origin"},"remoteAddress":"127.0.0.1","userAgent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:104.0) Gecko/20100101 Firefox/104.0","referer":"https://localhost:5601/app/apm/services?rangeFrom=now-15m&rangeTo=now&comparisonEnabled=true&comparisonType=day"},"res":{"statusCode":500,"responseTime":796,"contentLength":514},"message":"GET /internal/apm/services?environment=ENVIRONMENT_ALL&kuery=&start=2022-09-23T14%3A39%3A00.000Z&end=2022-09-23T14%3A54%3A08.616Z 500 796ms - 514.0B"}

This does not look like a permissions issue. The fielddata error on service.name suggests the APM indices were created without the expected APM index template (which maps service.name as a keyword), so it points to a custom index template or a misconfiguration of APM.
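
One way to confirm that (a sketch reusing the coordinator hostname from earlier and the index name apm-index-000001 from the log above) is to check how service.name is actually mapped on the failing index:

curl -k -u elastic:elastic "https://elastic-stack-coordinator:9200/apm-index-000001/_mapping/field/service.name?pretty"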

See the detailed walkthrough here: Failed to load resource: the server responded with a status of 500 (Internal Server Error) - #8 by caue.marcondes
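
I don't know the exact steps in that walkthrough, but a common remediation for this class of problem (a sketch, assuming shell access to the APM Server pod and that the data in the mis-mapped index can be discarded) is to reload the APM index template and then recreate the index so new documents pick up the keyword mapping:

# from the APM Server pod/container: load the bundled index template and ILM setup
apm-server setup --index-management
# then delete (or reindex) the index that was created with the wrong mapping
curl -k -u elastic:elastic -X DELETE "https://elastic-stack-coordinator:9200/apm-index-000001"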

Well, yeah... like I said, I had upgraded to the latest available Helm chart in the meantime.
