No matching token for number_type [BIG_INTEGER]

  • Stopped our logstash indexer
  • Dropped today's index
  • Altered the template in order to set the type of a specific, previously dynamic field to string, since it sometimes caused the index to be created with this dynamic field as a long and thus may have given field type conflicts (see the template sketch after the error below)
  • Started the logstash indexer
  • Got a new index for today auto-created
  • Verified field now is mapped as a string
  • Went to Kibana and got this error:

Error: Request to Elasticsearch failed: {"error":{"root_cause":[{"type":"illegal_state_exception","reason":"No matching token for number_type [BIG_INTEGER]"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query_fetch","grouped":true,"failed_shards":[{"shard":0,"index":"owmlog-2016.04.29","node":"gSLxsnTgS5iPcZtuhdlC3A","reason":{"type":"illegal_state_exception","reason":"No matching token for number_type [BIG_INTEGER]"}}]}}
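
For reference, the template change in step 3 was along these lines (just a sketch here; somefield stands in for the real field name and the template name is a placeholder):

curl -XPUT 'http://localhost:9200/_template/owmlog' -d '{
  "template": "owmlog-*",
  "mappings": {
    "_default_": {
      "properties": {
        "somefield": { "type": "string", "index": "not_analyzed" }
      }
    }
  }
}'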

Tried to drop and recreate the index a few times with the same result, why?

Wondering if the BIG_INTEGER is a Kibana issue or an ES one, any ideas?

Kibana 4.5.0 only sees 4 fields as numbers from ES 2.3.1:

nodePort  number
port      number
time      number
_score    number

How could I dig further down to find the data causing this?

Seems to have disappeared again after patching all my ES nodes to 2.3.2-1, including my tribe node on the Kibana host :relieved:

A bit too soon to cheer :confused:, it appeared again; hints on how to drill down to figure out why this is triggered would be appreciated!

Error: Request to Elasticsearch failed: {"error":{"root_cause":[{"type":"illegal_state_exception","reason":"No matching token for number_type [BIG_INTEGER]"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query_fetch","grouped":true,"failed_shards":[{"shard":0,"index":"popserv-2016.05.03","node":"lfWys2RkT9uuTCAz0lcmww","reason":{"type":"illegal_state_exception","reason":"No matching token for number_type [BIG_INTEGER]"}}]}}
KbnError@https:///bundles/commons.bundle.js?v=9889:61319:30
RequestFailure@https:///bundles/commons.bundle.js?v=9889:61352:19
https:///bundles/kibana.bundle.js?v=9889:89385:57
https:///bundles/commons.bundle.js?v=9889:63863:28
https:///bundles/commons.bundle.js?v=9889:63832:31
map@[native code]
map@https:///bundles/commons.bundle.js?v=9889:63831:34
callResponseHandlers@https:///bundles/kibana.bundle.js?v=9889:89357:26
https:///bundles/kibana.bundle.js?v=9889:88862:37
processQueue@https:///bundles/commons.bundle.js?v=9889:41836:31
https:///bundles/commons.bundle.js?v=9889:41852:40
$eval@https:///bundles/commons.bundle.js?v=9889:43080:29
$digest@https:///bundles/commons.bundle.js?v=9889:42891:37
$apply@https:///bundles/commons.bundle.js?v=9889:43188:32
done@https:///bundles/commons.bundle.js?v=9889:37637:54
completeRequest@https:///bundles/commons.bundle.js?v=9889:37835:16
requestLoaded@https:///bundles/commons.bundle.js?v=9889:37776:25

Nailed it down to documents like this:

 {"_index":"popserv-2016.05.25","_type":"popserv","_id":"AVTnEhpwWvfOjJPMxwwj","_score":1.2420747,
  "_source":
    {"message":"<redacted>","@timestamp":"2016-05-25T08:39:57.292Z","tags":[],
     "type":"popserv","host":"<redacted>","lglvl":"Note","event":"PopConnMade","user":<redacted>,
     "mbox":9xxxxxxxxxxxxxxxxxxx,
     "cmd":"<redacted>","fromhost":"<redacted ip>",
     "geoip":{"country_name":"<redacted>","continent_code":"EU","city_name":"<redacted>","location":[xx.520100000000014,yy.39019999999999],"ip":"<redact ip>"},
     "geoasn":{"number":"<redacted>","asn":"<redacted>"}
    }
 }
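
To see how many documents in an index actually carry such values, a regexp query on the not_analyzed mbox field for terms of 19 or more digits ought to flush them out (just a sketch, untested):

curl -XGET 'http://localhost:9200/popserv-2016.05.25/_search?pretty' -d '{
  "query": { "regexp": { "mbox": "[0-9]{19,}" } },
  "size": 3
}'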

And I have this template for the index:

{
  "template":"popserv-*",
  "settings":{"index":{"number_of_shards":1,"numver_replicas":1,"refresh_interval":"5s"}},
  "mappings":{
    "_default_":{
        "dynamic_templates":[{
            "template1":{
               "mapping":{"ignore_above":64,"index":"not_analyzed","omit_norms":"true","type":"{dynamic_type}","doc_values":true},
               "match":"*"}
        }],
        "_all":{"norms":{"enabled":false},"enabled":true},
        "properties":{
           "@timestamp":{"type":"date"},
           "host":{"index":"not_analyzed","omit_norms":"true","type":"string"},
           "user":{"index":"not_analyzed","omit_norms":"true","type":"string"},
           "mbox":{"index":"not_analyzed","omit_norms":"true","type":"string"},
           "event":{"index":"not_analyzed","omit_norms":"true","type":"string"},
           "type":{"index":"not_analyzed","omit_norms":"true","type":"string"},
           "geoip":{"properties":{"location":{"type":"geo_point"},"ip":{"type":"ip","omit_norms":true,"index":"not_analyzed"}}},
           "fromhost":{"index":"not_analyzed","omit_norms":"true","type":"ip"}
        }
     }
  },
  "aliases":{}
}
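
To double-check what Elasticsearch actually has registered (rather than what's in the JSON file I feed it), the templates can be listed straight from the API:

curl -XGET 'http://localhost:9200/_template?pretty'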

My concern is that the mbox field isn't made a string as the template says, since its value isn't quoted with '"' like the other string fields, and that 20-digit numbers cause this BIG_INTEGER issue (a signed 64-bit long maxes out at 9,223,372,036,854,775,807, which is only 19 digits).

Am I right that it's not a string when it isn't returned '"'-quoted, and if so, how can I force this field to be made a string if not via my template?

I don't suppose the query parameter 'lenient' can help either, if it's even possible to use from Kibana?

The mapping API says mbox is a string as expected, but why does a query return unquoted values then...
Or am I barking up the wrong tree with this BIG_INTEGER, though I'm pretty sure it's docs like this that break Kibana queries...

[root@cbdA ~]# curl -XGET 'http://localhost:9200/popserv-2016.05.25/_mapping'
{"popserv-2016.05.25":{
 "mappings":{
    "popserv":{"_all":{"enabled":true,"omit_norms":true},
      "dynamic_templates":[{"template1":{"mapping":{"ignore_above":64,"index":"not_analyzed","omit_norms":"true","type":"{dynamic_type}","doc_values":true},"match":"*"}}],
      "properties":{
        "@timestamp":{"type":"date","format":"strict_date_optional_time||epoch_millis"},
        "cmd":{"type":"string","index":"not_analyzed","ignore_above":64},
        "dirdelay":{"type":"long"},"event":{"type":"string","index":"not_analyzed"},
        "fromhost":{"type":"ip"},
        "geoasn":{"properties":{"asn":{"type":"string","index":"not_analyzed","ignore_above":64},"number":{"type":"string","index":"not_analyzed","ignore_above":64}}},
        "geoip":{"properties":{"city_name":{"type":"string","index":"not_analyzed","ignore_above":64},"continent_code":{"type":"string","index":"not_analyzed","ignore_above":64},"country_name":{"type":"string","index":"not_analyzed","ignore_above":64},"ip":{"type":"ip"},"location":{"type":"geo_point"}}},
        "host":{"type":"string","index":"not_analyzed"},
        "lglvl":{"type":"string","index":"not_analyzed","ignore_above":64},
        "mbox":{"type":"string","index":"not_analyzed"},
        "message":{"type":"string","index":"not_analyzed","ignore_above":64},
        "msgid":{"type":"string","index":"not_analyzed","ignore_above":64},
        "mss":{"type":"string","index":"not_analyzed","ignore_above":64},
        "node":{"type":"string","index":"not_analyzed","ignore_above":64},
        "nodePort":{"type":"long"},
        "port":{"type":"long"},
        "size":{"type":"long"},
        "time":{"type":"long"},
        "type":{"type":"string","index":"not_analyzed"},
        "user":{"type":"string","index":"not_analyzed"}}},
    "_default_":{"_all":{"enabled":true,"omit_norms":true},
      "dynamic_templates":[{"template1":{"mapping":{"ignore_above":64,"index":"not_analyzed","omit_norms":"true","type":"{dynamic_type}","doc_values":true},"match":"*"}}],
      "properties":{
        "@timestamp":{"type":"date","format":"strict_date_optional_time||epoch_millis"},
        "event":{"type":"string","index":"not_analyzed"},
        "fromhost":{"type":"ip"},
        "geoip":{"properties":{"ip":{"type":"ip"},"location":{"type":"geo_point"}}},
        "host":{"type":"string","index":"not_analyzed"},
        "mbox":{"type":"string","index":"not_analyzed"},
        "type":{"type":"string","index":"not_analyzed"},
        "user":{"type":"string","index":"not_analyzed"}}}
}}}

If I index a doc and prefix my large number for the mbox field with some letters, then the query API returns a '"'-quoted value and Kibana sees no BIG_INTEGER issue... Seems a bug to me that ES returns a number even though the type is mapped as a string, just because it sees only digits in the string value...
How to report this bug... I wonder

[root@cbdA ~]# curl -XGET 'http://localhost:9200/popserv-2016.05.25/_search?q=mbox=id9<redacted 19 digits>'
{..."mbox":"id9xxxxxxxxxxxxxxxxxxx"...}

:blush: it's not an ES bug,
it's my own mistake; I had a logstash filter that splits key/value pairs into fields with this Ruby bit:

          k = kvs[0]
          v = kvs[1]
          if v.match('\A\d+\Z')            # any all-digit value becomes an integer, however many digits
            event[k] = v.to_i
          elsif v.match('\A\d+\.\d+\Z')    # digits.digits becomes a float
            event[k] = v.to_f
          else
            event[k] = v                   # everything else stays a string
          end

Changing it to this seems to avoid producing integers too large for a long:

          k = kvs[0]
          v = kvs[1]
          # only convert all-digit values of at most 18 digits, so the result always
          # fits in a signed 64-bit long (max 9223372036854775807, i.e. 19 digits)
          if v.match('\A\d+\Z') and v.length <= 18
            event[k] = v.to_i
          elsif v.match('\A\d+\.\d+\Z')
            event[k] = v.to_f
          else
            event[k] = v
          end