Kibana showing "invalid Zlib data" after upgrading to 7.14

I have an Elasticsearch server set up as an "all-in-one" with a Filebeat -> Logstash -> Elasticsearch flow. I use Cerebro to check status.

The data was visible and working well prior to upgrading to 7.14. After the upgrade, Elasticsearch itself still appears healthy: data is ingesting, and Cerebro can perform searches and gets proper responses.

Kibana can do stack monitoring and management, but Discover gives me errors stating "Invalid Zlib data". All my dashboards fail with the same error.

I cannot find any references to this problem in my searches. There was one other person with this error, and their fix was: "Got it fixed, the problem was the .async-search index was red. Deleted the index and everything sorted out."
My .async-search index was not red, but I deleted it anyway; it was rebuilt with no change in behavior.

I took a Kibana Discover search that was not working, ran it through Cerebro's REST interface, and got the results I was expecting.

Here is my search (looking for all events from the last minute):

{
  "track_total_hits": true,
  "size": 500,
  "sort": [
    {
      "@timestamp": {
        "order": "desc",
        "unmapped_type": "boolean"
      }
    }
  ],
  "version": true,
  "fields": [
    {
      "field": "*",
      "include_unmapped": "true"
    },
    {
      "field": "@timestamp",
      "format": "strict_date_optional_time"
    },
    {
      "field": "event.created",
      "format": "strict_date_optional_time"
    },
    {
      "field": "event.end",
      "format": "strict_date_optional_time"
    },
    {
      "field": "event.start",
      "format": "strict_date_optional_time"
    },
    {
      "field": "file.accessed",
      "format": "strict_date_optional_time"
    },
    {
      "field": "file.changed",
      "format": "strict_date_optional_time"
    },
    {
      "field": "file.created",
      "format": "strict_date_optional_time"
    },
    {
      "field": "file.ctime",
      "format": "strict_date_optional_time"
    },
    {
      "field": "file.modified",
      "format": "strict_date_optional_time"
    },
    {
      "field": "file.mtime",
      "format": "strict_date_optional_time"
    }
  ],
  "aggs": {
    "2": {
      "date_histogram": {
        "field": "@timestamp",
        "fixed_interval": "1s",
        "time_zone": "America/Los_Angeles",
        "min_doc_count": 1
      }
    }
  },
  "script_fields": {
    "pcap.query": {
      "script": {
        "source": "if (doc['event.category'].value == 'network') {\n  if (doc['client.ip'].size() != 0 && doc['client.port'].size() != 0 && doc['server.ip'].size() != 0 && doc['server.port'].size() != 0) {\n    String api_endpoint = '/app/docket/api/uri/';\n    String host1 = doc['client.ip'].value;\n    String host2 = doc['server.ip'].value;\n    String port1 = doc['client.port'].value.toString();\n    String port2 = doc['server.port'].value.toString();\n    String timestamp = doc['@timestamp'].value.toString();\n    timestamp = timestamp.substring(0,timestamp.indexOf('Z'));\n    String begin = timestamp.substring(0,timestamp.indexOf('.')) + 'Z';\n    \n    return     api_endpoint + \n              'host/' + host1 + '/port/' + port1 +\n              '/host/' + host2 + '/port/' + port2 + \n              '/after/' + begin + '/';\n  }\n  else if (doc['source.ip'].size() != 0 && doc['source.port'].size() != 0 && doc['destination.ip'].size() != 0 && doc['destination.port'].size() != 0) {\n    String api_endpoint = '/app/docket/api/uri/';\n    String host1 = doc['source.ip'].value;\n    String host2 = doc['destination.ip'].value;\n    String port1 = doc['source.port'].value.toString();\n    String port2 = doc['destination.port'].value.toString();\n    String timestamp = doc['@timestamp'].value.toString();\n    timestamp = timestamp.substring(0,timestamp.indexOf('Z'));\n    String begin = timestamp.substring(0,timestamp.indexOf('.')) + 'Z';\n    \n    return     api_endpoint + \n              'host/' + host1 + '/port/' + port1 +\n              '/host/' + host2 + '/port/' + port2 + \n              '/after/' + begin + '/';\n  }\n  return '';\n}\nreturn '';",
        "lang": "painless"
      }
    }
  },
  "stored_fields": [
    "*"
  ],
  "runtime_mappings": {},
  "_source": false,
  "query": {
    "bool": {
      "must": [],
      "filter": [
        {
          "range": {
            "@timestamp": {
              "gte": "2021-08-27T17:16:11.723Z",
              "lte": "2021-08-27T17:17:11.723Z",
              "format": "strict_date_optional_time"
            }
          }
        }
      ],
      "should": [],
      "must_not": []
    }
  },
  "highlight": {
    "pre_tags": [
      "@kibana-highlighted-field@"
    ],
    "post_tags": [
      "@/kibana-highlighted-field@"
    ],
    "fields": {
      "*": {}
    },
    "fragment_size": 2147483647
  }
}
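
For anyone who wants to reproduce the check without Cerebro, the same body can be sent straight to Elasticsearch with curl, which bypasses Kibana entirely; a minimal sketch, assuming Elasticsearch is on localhost:9200, the index pattern is filebeat-*, and the body above is saved as query.json (all of those are placeholders for my actual setup):

# host, port, and index pattern are placeholders; query.json holds the search body above
curl -s -X POST "http://localhost:9200/filebeat-*/_search" \
  -H 'Content-Type: application/json' \
  --data-binary @query.json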

Kibana says:

1 request was made, 1 had a failure

Request: data (438ms)
This request queries Elasticsearch to fetch the data for the search.
Search session id: 21a9161d-644c-424f-be79-310a0f06f3d4

But Cerebro returns the expected records.

What can I do to either fix this in Kibana or diagnose it to find the actual issue?

Thank you.

I was also able to perform the same search from Kibana's Dev Tools (see the abbreviated example below). But Discover and dashboards do not work.
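
In Dev Tools that looks roughly like the following; the index pattern is a placeholder for mine, and the body is abbreviated to the time-range filter from the full search above:

# filebeat-* is a placeholder index pattern; body abbreviated from the full search above
POST filebeat-*/_search
{
  "size": 500,
  "query": {
    "bool": {
      "filter": [
        {
          "range": {
            "@timestamp": {
              "gte": "2021-08-27T17:16:11.723Z",
              "lte": "2021-08-27T17:17:11.723Z",
              "format": "strict_date_optional_time"
            }
          }
        }
      ]
    }
  }
}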

Thanks.

From what version did you upgrade to 7.14? What do the Kibana and ES logs say? Screenshots would also help. We do not support third-party Cerebro APIs. Can you also check the network configuration and see whether you are actually reaching Kibana?

I located the issue. I am running Kibana behind lighttpd, and it seems to be a problem with lighttpd parsing the responses.

I shifted to nginx and Kibana is working again.

Something is wrong with lighttpd's proxy pass for /internal/bsearch. I found a 502 Bad Gateway error in my browser's developer tools, which pointed me to a problem that was not visible in the Kibana logs. Kibana was returning data, but lighttpd was not able to pass the results on to the browser...
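
For anyone hitting the same thing, the working setup is just a plain nginx reverse proxy in front of Kibana; a minimal sketch, assuming Kibana listens on localhost:5601 (the server name and listen port are placeholders, not my real config):

server {
    listen 80;
    server_name kibana.example.com;   # placeholder server name

    location / {
        # Forward everything (including /internal/bsearch) to Kibana
        proxy_pass http://localhost:5601;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header Connection "";
    }
}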
