Failing to discover index - SOLVED

I created a small index with a @timestamp field and three string fields, added a couple of docs, and picked up the index in Kibana successfully. But trying to discover my docs in Kibana fails with the error: 'Discover: Unable to parse/serialize body'
Wondering why; any hints appreciated, TIA!

mapping:

# curl -XGET http://localhost:9200/annotation/_mapping/event?pretty
{
  "annotation" : {
    "mappings" : {
      "event" : {
        "_all" : {
          "enabled" : true,
          "omit_norms" : true
        },
        "dynamic_templates" : [ {
          "template1" : {
            "mapping" : {
              "norms" : {
                "enabled" : false
              },
              "ignore_above" : 64,
              "index" : "not_analyzed",
              "omit_norms" : true,
              "type" : "{dynamic_type}",
              "doc_values" : true
            },
            "match" : "*"
          }
        } ],
        "properties" : {
          "@timestamp" : {
            "type" : "date",
            "format" : "strict_date_optional_time||epoch_millis"
          },
          "descr" : {
            "type" : "string",
            "index" : "not_analyzed"
          },
          "tags" : {
            "type" : "string",
            "norms" : {
              "enabled" : false
            },
            "analyzer" : "whitespace"
          },
          "title" : {
            "type" : "string",
            "index" : "not_analyzed"
          }
        }
      }
    }
  }
}

a few docs:

# curl -XGET http://localhost:9200/annotation/_search?pretty -d '{query:{match_all:{}}}'
{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 2,
    "max_score" : 1.0,
    "hits" : [ {
      "_index" : "annotation",
      "_type" : "event",
      "_id" : "2",
      "_score" : 1.0,
      "_source" : {
        "@timestamp" : "2016-07-10T08:16:21.000Z",
        "title" : "this is a test deployment",
        "tags" : "mx9 deploy",
        "descr" : "this is the description of the deployment"
      }
    }, {
      "_index" : "annotation",
      "_type" : "event",
      "_id" : "1",
      "_score" : 1.0,
      "_source" : {
        "@timestamp" : "2016-07-09T23:06:21.000Z",
        "title" : "test title 4",
        "tags" : "patch mx9",
        "descr" : "this is the description of the patch"
      }
    } ]
  }
}

There should be a button on the error message to show more info. Could you click that and paste the stack trace here?

Also, what version of Kibana/Elasticsearch are you using?

Kibana 4.5.2, ES 2.3.4

Error: Unable to parse/serialize body
ErrorAbstract@https://<redacted>/bundles/kibana.bundle.js?v=9896:62835:29
Serialization@https://<redacted>/bundles/kibana.bundle.js?v=9896:62900:22
respond@https://<redacted>/bundles/kibana.bundle.js?v=9896:64182:42
checkRespForFailure@https://<redacted>/bundles/kibana.bundle.js?v=9896:64165:15
https://<redacted>/bundles/kibana.bundle.js?v=9896:62780:8
processQueue@https://<redacted>/bundles/commons.bundle.js?v=9896:41864:31
https://<redacted>/bundles/commons.bundle.js?v=9896:41880:40
$eval@https://<redacted>/bundles/commons.bundle.js?v=9896:43108:29
$digest@https://<redacted>/bundles/commons.bundle.js?v=9896:42919:37
$apply@https://<redacted>/bundles/commons.bundle.js?v=9896:43216:32
done@https://<redacted>/bundles/commons.bundle.js?v=9896:37665:54
completeRequest@https://<redacted>/bundles/commons.bundle.js?v=9896:37863:16
requestLoaded@https://<redacted>/bundles/commons.bundle.js?v=9896:37804:25

Looks like the elasticsearch js client failed to parse an Elasticsearch response. Do you see any failed requests in the Network tab of your browser dev tools? If none of the requests themselves failed, perhaps you can see which one is being processed when the error occurs.

This is the only request in the Network tab (_msearch):

{"index":["annotation"],"ignore_unavailable":true}
{"size":500,"sort":[{"@timestamp":{"order":"desc","unmapped_type":"boolean"}}],"query":{"filtered":{"query":{"query_string":{"analyze_wildcard":true,"query":"*"}},"filter":{"bool":{"must":[{"range":{"@timestamp":{"gte":1467792171807,"lte":1468396971807,"format":"epoch_millis"}}}],"must_not":[]}}}},"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"*":{}},"require_field_match":false,"fragment_size":2147483647},"aggs":{"2":{"date_histogram":{"field":"@timestamp","interval":"3h","time_zone":"Europe/Berlin","min_doc_count":0,"extended_bounds":{"min":1467792171806,"max":1468396971806}}}},"fields":["*","_source"],"script_fields":{},"fielddata_fields":["@timestamp"]}

Its response from the Network tab (multiple empty buckets suppressed here as "..."):

{
    "responses": [{
        "took": 3,
        "timed_out": false,
        "_shards": {
            "total": 5,
            "successful": 5,
            "failed": 0
        },
        "hits": {
            "total": 2,
            "max_score": null,
            "hits": [{
                "_index": "annotation",
                "_type": "event",
                "_id": "2",
                "_score": null,
                "_source": {
                    @timestamp: "2016-07-10T08:16:21.000Z",
                    title: "this is a test deployment",
                    tags: "mx9 deploy",
                    descr: "this is the description of the deployment"
                },
                "fields": {
                    "@timestamp": [1468138581000]
                },
                "sort": [1468138581000]
            }, {
                "_index": "annotation",
                "_type": "event",
                "_id": "1",
                "_score": null,
                "_source": {
                    @timestamp: "2016-07-09T23:06:21.000Z",
                    title: "test title 4",
                    tags: "patch mx9",
                    descr: "this is the description of the patch"
                },
                "fields": {
                    "@timestamp": [1468105581000]
                },
                "sort": [1468105581000]
            }
            ]
        },
        "aggregations": {
            "2": {
                "buckets": [{
                    "key_as_string": "2016-07-06T09:00:00.000+02:00",
                    "key": 1467788400000,
                    "doc_count": 0
                }, {
                    "key_as_string": "2016-07-06T12:00:00.000+02:00",
                    "key": 1467799200000,
                    "doc_count": 0
                ...
                }, {
                    "key_as_string": "2016-07-10T00:00:00.000+02:00",
                    "key": 1468101600000,
                    "doc_count": 1
                }, {
                    "key_as_string": "2016-07-10T03:00:00.000+02:00",
                    "key": 1468112400000,
                    "doc_count": 0
                }, {
                    "key_as_string": "2016-07-10T06:00:00.000+02:00",
                    "key": 1468123200000,
                    "doc_count": 0
                }, {
                    "key_as_string": "2016-07-10T09:00:00.000+02:00",
                    "key": 1468134000000,
                    "doc_count": 1
                }, {
                    "key_as_string": "2016-07-10T12:00:00.000+02:00",
                    "key": 1468144800000,
                    "doc_count": 0
               ...
                }, {
                    "key_as_string": "2016-07-13T09:00:00.000+02:00",
                    "key": 1468393200000,
                    "doc_count": 0
                }
                ]
            }
        }
    }
    ]
}

Hmmm, to be honest I'm a bit stumped. I just recreated your index with the provided mappings and sample data, and I'm not able to reproduce the error. Do you think you could provide exact steps to reproduce the issue, starting with a fresh install of Kibana and ES?

Here's another wild guess: do you happen to be using a proxy? Sometimes proxies can strip out parts of the request/response and cause odd errors like this.

I've got the same issue. If I hit the Discover view for the index with no filter/query, everything looks good. But if I filter it down to one specific type ("_type:vm"), I get this error. Presumably the data Kibana considers malformed just isn't in the first batch fetched when it pulls back the full dataset?

It started happening on Kibana 4.5.0 and is still happening after an upgrade to 4.5.3.

Unfortunately I can't share the full response from /elasticsearch/msearch (sensitive data).

Perhaps you could run the search results through a JSON linter like https://github.com/zaach/jsonlint ?

Should be as simple as using curl to recreate the query Kibana is breaking on and piping the results to jsonlint.
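If jsonlint or jq isn't handy, Python's stdlib json module reports the same kind of parse location. A minimal sketch (the sample strings here are illustrative stand-ins, not the actual failing response):

```python
import json

def check_json(raw):
    """Return 'valid JSON' or a jq-style parse error location."""
    try:
        json.loads(raw)
        return "valid JSON"
    except json.JSONDecodeError as e:
        return "invalid JSON at line %d, column %d: %s" % (e.lineno, e.colno, e.msg)

# A quoted field name parses fine; an unquoted one does not.
good = '{"_source": {"@timestamp": "2016-07-10T08:16:21.000Z"}}'
bad = '{"_source": {@timestamp: "2016-07-10T08:16:21.000Z"}}'
print(check_json(good))  # valid JSON
print(check_json(bad))   # invalid JSON at line 1, column 14: Expecting property name ...
```

Pasting the response saved from the Network tab into a file and feeding it to `check_json` should point at the first offending field name.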

Not using a proxy here; connecting to an ES tribe node on localhost.

Anyone else got a hint for this?

@stefws could you try running the failing ES response through a JSON linter like I mentioned above? I'm wondering if you have invalid JSON in your docs which Kibana is barfing on.

jq doesn't complain, but then again this CLI query is not exactly the same query as Kibana sends. How to extract that... directly from the response in the Network tab maybe...

curl -sXGET http://62.243.41.175:9200/annotation/_search?pretty -d '{query:{match_all:{}}}' | jq .

{
  "took": 2,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "failed": 0
  },
  "hits": {
    "total": 2,
    "max_score": 1,
    "hits": [
      {
        "_index": "annotation",
        "_type": "event",
        "_id": "2",
        "_score": 1,
        "_source": {
          "@timestamp": "2016-07-10T08:16:21.000Z",
          "title": "this is a test deployment",
          "tags": "mx9 deploy",
          "descr": "this is the description of the deployment"
        }
      },
      {
        "_index": "annotation",
        "_type": "event",
        "_id": "1",
        "_score": 1,
        "_source": {
          "@timestamp": "2016-07-09T23:06:21.000Z",
          "title": "test title 4",
          "tags": "patch mx9",
          "descr": "this is the description of the patch"
        }
      }
    ]
  }
}

When copy-pasting the response of the _msearch from the Network tab, jq says this:

# cat /tmp/resp | jq .
parse error: Invalid numeric literal at line 19, column 31

line 19 contains:

@timestamp: "2016-07-10T08:16:21.000Z",

with the ':' after @timestamp being at column 31

If I quote all the field names of both "_source" values in the response, like in my CLI _search response, then jq is happy, like this:

        "_source": {
          "@timestamp": "2016-07-10T08:16:21.000Z",
          "title": "this is a test deployment",
          "tags": "mx9 deploy",
          "descr": "this is the description of the deployment"
        },

but whether that's the issue for Kibana, and whether it's due to a failed ES mapping/creation of the index and/or the inserted documents, I dunno.

Ah, so is @timestamp unquoted in one of your source documents? That could cause the issue; JSON requires double quotes around field names.

Like this non-pretty _search response (it seems that without ?pretty the stored _source comes back verbatim, which exposes the unquoted field names):

# curl -sXGET http://62.243.41.175:9200/annotation/_search -d '{query:{match_all:{}}}' 
{"took":1,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":2,"max_score":1.0,"hits":[{"_index":"annotation","_type":"event","_id":"2","_score":1.0,"_source":{@timestamp:"2016-07-10T08:16:21.000Z", title:"this is a test deployment", tags:"mx9 deploy", descr:"this is the description of the deployment"}},{"_index":"annotation","_type":"event","_id":"1","_score":1.0,"_source":{@timestamp:"2016-07-09T23:06:21.000Z", title:"test title 4", tags:"patch mx9", descr:"this is the description of the patch"}}]}}

Meaning, is my document data really that invalid?

Hm, or more properly my index creation/mapping... how do I avoid field names without quotes?

Ah, maybe I need to escape the quotes on document creation, like:

# curl -XPUT "http://localhost:9200/annotation/event/3" -d "{\"@timestamp\":\"2016-08-03T23:21:56.000Z\", \"title\":\"test patch title\", \"tags\":\"patch something\", \"descr\":\"this is the description of the patch\"}"
{"_index":"annotation","_type":"event","_id":"3","_version":2,"_shards":{"total":2,"successful":2,"failed":0},"created":false}

then non-pretty returns quoted field names:

# curl -XGET "http://localhost:9200/annotation/_search" -d '{query:{match_all:{}}}'
{"took":2,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":1,"max_score":1.0,"hits":[{"_index":"annotation","_type":"event","_id":"3","_score":1.0,"_source":{"@timestamp":"2016-08-03T23:21:56.000Z", "title":"test patch title", "tags":"patch something", "descr":"this is the description of the patch"}}]}}

and then Kibana works :slight_smile: and will discover documents. Thanks for all the hints/help!
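For the record, rather than hand-escaping quotes in the shell, the document body can be built with any JSON library, which guarantees every field name comes out double-quoted. A minimal Python sketch (the field values mirror the curl command above; the curl line in the comment is just one hypothetical way to send the result):

```python
import json

# Build the document as a native dict and let the serializer do the quoting;
# every field name in the output is guaranteed to be wrapped in double quotes,
# so the stored _source will always be valid JSON.
doc = {
    "@timestamp": "2016-08-03T23:21:56.000Z",
    "title": "test patch title",
    "tags": "patch something",
    "descr": "this is the description of the patch",
}
body = json.dumps(doc)
print(body)

# The serialized body could then be indexed, e.g.:
#   curl -XPUT http://localhost:9200/annotation/event/3 -d "$body"
```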

Awesome! Glad to help. Sorry it took so long to find a solution; I'll have to remember that one the next time I see this error pop up!