Update: I've resolved the CSP and 403 errors I saw, but am now blocked by this:
https://elk-1.XXXXX.com/internal/data_views/_fields_for_wildcard?pattern=ecs-logstash-*&meta_fields=_source&meta_fields=_id&meta_fields=_index&meta_fields=_score&allow_no_index=true
[Error] Failed to load resource: the server responded with a status of 404 (Not Found) (_fields_for_wildcard, line 0)
Also, I'm now back on version 8.12.2.
Following a hasty Kibana upgrade (to 8.12.2) and rollback (to 8.5.3), I'm able to run Kibana without errors in the Kibana logs. However, my Elasticsearch indices (populated by Logstash) are not displaying. For example, when I attempt to view the Data View for those indices, I receive what looks like a CSP error in my browser, and I'm unable to assign a timestamp for that Data View. Any pointers on how to troubleshoot this further?
Bumping this; I'd really appreciate any help. I've been digging quite a bit and have some further clues, but I'm not sure where to go from here.
Basic issue: when creating a Data View, no fields appear in the Timestamp field dropdown. Yet in Discover, all fields display and are mapped correctly (@timestamp is a date field). (And all of this had been working before.)
Some findings:
As far as I can tell, all other Kibana functions work fine.
There is another Data View that's attached to a much older index, and that works fine.
The problematic index (ecs-logstash-*) appears fine and is still ingesting data from Logstash without issue.
When opening the Data View management page, I see this error:
[2024-03-25T08:02:35.565-07:00][DEBUG][elasticsearch.query.data] [RequestAbortedError]: The content length (536879546) is bigger than the maximum allowed string (536870888)
Even though I have this set in kibana.yml:
server.maxPayload: 1938946807
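As far as I can tell, the oversized response comes from Elasticsearch itself (Kibana's `_fields_for_wildcard` endpoint fetches field capabilities for the pattern), so one way to confirm is to measure the raw `_field_caps` response size and compare it against Node's maximum string length. A rough sketch; the Elasticsearch URL is a placeholder and the functions here are just illustrative helpers:

```python
# Sketch: fetch the _field_caps response Kibana would request for a pattern
# and compare its size to Node.js's maximum string length. The ES URL,
# credentials, and function names are placeholders/assumptions.
import urllib.request

NODE_MAX_STRING = 536_870_888  # the limit named in the RequestAbortedError


def field_caps_size(es_url: str, pattern: str) -> int:
    """Return the size in bytes of the _field_caps response for `pattern`."""
    with urllib.request.urlopen(f"{es_url}/{pattern}/_field_caps?fields=*") as resp:
        return len(resp.read())


def exceeds_node_limit(size_bytes: int) -> bool:
    """True if a response this large would trip the RequestAbortedError."""
    return size_bytes > NODE_MAX_STRING
```

For example, the 536879546-byte response in the log above is just past the limit, while the 469.0B response is nowhere near it.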
I'm seeing lots of these in general:
[2024-03-25T06:55:03.407-07:00][DEBUG][http.server.response] GET /api/settings?extended=true&legacy=true 404 6ms - 60.0B
Other logs as I'm attempting to add the Data View:
[2024-03-25T06:55:02.755-07:00][DEBUG][elasticsearch.query.data] 200 - 469.0B
GET /_nodes?filter_path=nodes.*.version%2Cnodes.*.http.publish_address%2Cnodes.*.ip
[2024-03-25T06:55:03.130-07:00][DEBUG][elasticsearch.query.data] [RequestAbortedError]: The content length (536879546) is bigger than the maximum allowed string (536870888)
[2024-03-25T06:55:03.131-07:00][DEBUG][plugins.licensing] Requesting Elasticsearch licensing API
[2024-03-25T06:55:03.143-07:00][DEBUG][elasticsearch.query.data] 200 - 1.4KB
GET /_xpack
[2024-03-25T06:55:03.145-07:00][DEBUG][http.server.response] GET /internal/data_views/_fields_for_wildcard?pattern=ecs-logstash-*&meta_fields=_source&meta_fields=_id&meta_fields=_index&meta_fields=_score&allow_hidden=false 404 10221ms - 194.0B
[2024-03-25T06:55:03.384-07:00][DEBUG][http.server.Kibana.cookie-session-storage] Error: Unauthorized
[2024-03-25T06:55:03.384-07:00][DEBUG][plugins.security.basic.basic] Trying to authenticate user request to /api/status.
[2024-03-25T06:55:03.385-07:00][DEBUG][plugins.security.basic.basic] Cannot authenticate requests with `Authorization` header.
[2024-03-25T06:55:03.385-07:00][DEBUG][plugins.security.http] Trying to authenticate user request to /api/status.
[2024-03-25T06:55:03.387-07:00][DEBUG][elasticsearch.query.data] 200 - 353.0B
GET /_security/_authenticate
[redacted]
[2024-03-25T06:55:03.387-07:00][DEBUG][plugins.security.http] Request to /api/status has been authenticated via authorization header with "Basic" scheme.
[2024-03-25T06:55:03.391-07:00][DEBUG][http.server.response] GET /api/status 200 7ms - 18.1KB
[2024-03-25T06:55:03.392-07:00][DEBUG][http.server.Kibana.cookie-session-storage] Error: Unauthorized
[2024-03-25T06:55:03.392-07:00][DEBUG][plugins.security.basic.basic] Trying to authenticate user request to /api/stats?extended=true&legacy=true&exclude_usage=true.
[2024-03-25T06:55:03.392-07:00][DEBUG][plugins.security.basic.basic] Cannot authenticate requests with `Authorization` header.
[2024-03-25T06:55:03.392-07:00][DEBUG][plugins.security.http] Trying to authenticate user request to /api/stats?extended=true&legacy=true&exclude_usage=true.
[2024-03-25T06:55:03.394-07:00][DEBUG][elasticsearch.query.data] 200 - 353.0B
GET /_security/_authenticate
[redacted]
[2024-03-25T06:55:03.394-07:00][DEBUG][plugins.security.http] Request to /api/stats?extended=true&legacy=true&exclude_usage=true has been authenticated via authorization header with "Basic" scheme.
[2024-03-25T06:55:03.397-07:00][DEBUG][elasticsearch.query.data] 200 - 48.0B
GET /?filter_path=cluster_uuid
[2024-03-25T06:55:03.400-07:00][DEBUG][http.server.response] GET /api/stats?extended=true&legacy=true&exclude_usage=true 200 9ms - 2.4KB
[2024-03-25T06:55:03.401-07:00][DEBUG][http.server.Kibana.cookie-session-storage] Error: Unauthorized
[2024-03-25T06:55:03.401-07:00][DEBUG][plugins.security.basic.basic] Trying to authenticate user request to /api/settings?extended=true&legacy=true.
[2024-03-25T06:55:03.401-07:00][DEBUG][plugins.security.basic.basic] Cannot authenticate requests with `Authorization` header.
[2024-03-25T06:55:03.402-07:00][DEBUG][plugins.security.http] Trying to authenticate user request to /api/settings?extended=true&legacy=true.
[2024-03-25T06:55:03.403-07:00][DEBUG][elasticsearch.query.data] 200 - 353.0B
GET /_security/_authenticate
[redacted]
[2024-03-25T06:55:03.403-07:00][DEBUG][plugins.security.http] Request to /api/settings?extended=true&legacy=true has been authenticated via authorization header with "Basic" scheme.
[2024-03-25T06:55:03.404-07:00][DEBUG][plugins.licensing] Requesting Elasticsearch licensing API
[2024-03-25T06:55:03.406-07:00][DEBUG][elasticsearch.query.data] 200 - 1.4KB
GET /_xpack
[2024-03-25T06:55:03.407-07:00][DEBUG][http.server.response] GET /api/settings?extended=true&legacy=true 404 6ms - 60.0B
[2024-03-25T06:55:04.546-07:00][DEBUG][elasticsearch.query.data] 200 - 227.0B
POST /.kibana_task_manager/_update_by_query?ignore_unavailable=true&refresh=true
I've successfully created a new Data View by using this filter string:
ecs-logstash-2024*
Instead of:
ecs-logstash-*
This obviously only gives me this year's data, and since I retain one year, I would really like to get it all in one view. This does point, then, to some data-size limitation. I'll keep looking...
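As a stopgap, since the narrower pattern works, one could script a Data View per year through Kibana's data-views API instead of clicking through the UI. A hedged sketch; the Kibana URL, credentials, and helper names are placeholders, not anything from my actual setup:

```python
# Sketch: create one Data View per year via POST /api/data_views/data_view.
# Kibana URL, auth header, and the year range are placeholders.
import json
import urllib.request


def data_view_payload(year: int) -> dict:
    """Build the request body for Kibana's data-views API for one year."""
    return {
        "data_view": {
            "title": f"ecs-logstash-{year}*",
            "name": f"ecs-logstash {year}",
            "timeFieldName": "@timestamp",
        }
    }


def create_data_view(kibana_url: str, year: int, auth_header: str) -> None:
    """POST the payload to Kibana; kbn-xsrf is required by its HTTP API."""
    req = urllib.request.Request(
        f"{kibana_url}/api/data_views/data_view",
        data=json.dumps(data_view_payload(year)).encode(),
        headers={
            "Content-Type": "application/json",
            "kbn-xsrf": "true",
            "Authorization": auth_header,
        },
        method="POST",
    )
    urllib.request.urlopen(req)
```

This is only a workaround, of course; it still leaves the data split across views rather than in one.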
I understand now that the 536870888-byte limit is a limitation of Node.js (its maximum string length) and can't really be changed.
My next question, then, is: what data is consuming these bytes? Is it the number of fields in the index? The amount of data in a field? The cumulative size of the wildcarded indices? The number of indices aliased?
And given that answer, how can those bytes be reduced?
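My working assumption is that the field-caps payload grows roughly with the number of distinct mapped fields multiplied by the indices they appear in (not the amount of document data), so counting mapped fields per concrete index should hint at where the bytes go. A sketch, with the ES URL as a placeholder and the counting logic as my own assumption about mapping structure:

```python
# Sketch: count leaf fields per index from /_mapping to estimate what is
# inflating the field-caps response. ES URL and helper names are placeholders.
import json
import urllib.request


def count_fields(properties: dict) -> int:
    """Recursively count leaf fields (including multi-fields) in a mapping."""
    total = 0
    for field in properties.values():
        if "properties" in field:  # object field: recurse into sub-fields
            total += count_fields(field["properties"])
        else:  # leaf field, plus any multi-fields like `.keyword`
            total += 1 + len(field.get("fields", {}))
    return total


def fields_per_index(es_url: str, pattern: str) -> dict:
    """Map each concrete index matching `pattern` to its mapped-field count."""
    with urllib.request.urlopen(f"{es_url}/{pattern}/_mapping") as resp:
        mappings = json.load(resp)
    return {
        index: count_fields(body["mappings"].get("properties", {}))
        for index, body in mappings.items()
    }
```

If the per-index counts are huge (e.g. from mapping explosion via dynamic mappings), trimming the mappings or consolidating indices would shrink the response; if they're modest, the sheer number of daily indices under the wildcard may be the driver.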