Bigint values in Elasticsearch

Currently I am using Logstash to query a MySQL database and index the data into Elasticsearch. I am seeing the following error:

@data={"id"=>592753, "source"=>"Add new address", "target"=>"加入新的地址", "lastmodified"=>"2008-04-07T23:04:26.000Z", "sourcedigest"=>12120909617543216096, "targetdigest"=>8116712605014414362, "sourcewordcount"=>3, "targetwordcount"=>6, "upload_id"=>399, "@version"=>"1", "@timestamp"=>"2017-01-25T14:54:09.879Z", "tags"=>["_elasticsearch_lookup_failure"]}, @metadata_accessors=#<LogStash::Util::Accessors:0x3543bfdb @store={}, @lut={}>, @cancelled=false>], :response=>{"index"=>{"_index"=>"en-us_zh-hk", "_type"=>"segment", "_id"=>"592753", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"No matching token for number_type [BIG_INTEGER]"}}}}, :level=>:warn}

The "sourcedigest" field seems to be the issue here: it is mapped as type long, but the actual value (12120909617543216096) is larger than the maximum a long can hold (9223372036854775807), so it cannot be stored in that field. Is there a solution or workaround for this?
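
One workaround I can think of on the Logstash side would be to convert the digest fields to strings before they are indexed, e.g. with a mutate filter (a minimal sketch, untested, assuming the fields arrive from the input under the names shown in the error above):

filter {
  # Convert the oversized digest values to strings so Elasticsearch
  # never receives a number that overflows its long type.
  mutate {
    convert => {
      "sourcedigest" => "string"
      "targetdigest" => "string"
    }
  }
}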

Send it as a string mapped to a field of type keyword. It looks like you'll need to wrap the number in quotes too, otherwise it is returned in a truncated form in the results:

DELETE test

PUT test
{
  "settings": {
    "index": {
      "number_of_shards": 1
    }
  },
  "mappings": {
    "doc": {
      "properties": {
        "num": {
          "type": "keyword"
        }
      }
    }
  }
}

POST test/doc/bad
{
  "num": 12120909617543216096
}

POST test/doc/ok
{
  "num": "12120909617543216096"
}

GET test/doc/_search
{
  "query": {
    "match": {
      "num": 12120909617543216096
    }
  }
}
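
To see the truncated form mentioned above, you can fetch both documents back and compare the responses (plain document GETs against the same test index):

GET test/doc/bad

GET test/doc/ok

The quoted value in the ok document comes back exactly as sent; the bad document shows the truncation described above.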
