Currently I am using Logstash to query a MySQL DB and index the data into Elasticsearch. I see the following error:
@data={"id"=>592753, "source"=>"Add new address", "target"=>"加入新的地址", "lastmodified"=>"2008-04-07T23:04:26.000Z", "sourcedigest"=>12120909617543216096, "targetdigest"=>8116712605014414362, "sourcewordcount"=>3, "targetwordcount"=>6, "upload_id"=>399, "@version"=>"1", "@timestamp"=>"2017-01-25T14:54:09.879Z", "tags"=>["_elasticsearch_lookup_failure"]}, @metadata_accessors=#<LogStash::Util::Accessors:0x3543bfdb @store={}, @lut={}>, @cancelled=false>], :response=>{"index"=>{"_index"=>"en-us_zh-hk", "_type"=>"segment", "_id"=>"592753", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"No matching token for number_type [BIG_INTEGER]"}}}}, :level=>:warn}
The "sourcedigest" field seems to be the issue here: it is mapped as type long, but its value (12120909617543216096) is larger than the maximum long value (9223372036854775807), so Elasticsearch cannot store it and reports "No matching token for number_type [BIG_INTEGER]". Is there a solution/workaround here?
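One possible workaround, assuming the digests are only used for exact lookups and never for range queries or arithmetic, is to convert them to strings in the Logstash pipeline with the mutate filter's `convert` option, so they are indexed as string/keyword values instead of numbers. A minimal sketch (the field names are taken from the failing event above):

```
filter {
  # Convert the unsigned 64-bit digests to strings before they
  # reach Elasticsearch, avoiding the signed-long overflow.
  mutate {
    convert => {
      "sourcedigest" => "string"
      "targetdigest" => "string"
    }
  }
}
```

Note that the index already has these fields mapped as long, and an existing mapping cannot be changed in place; you would need to create a new index (or delete and recreate this one) with the digest fields mapped as a string type, then reindex.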