Wrong long value when storing a long value in ES

Today I ran into a long value conversion issue.
To show it clearly, I captured the details below:

logstash: 2.1.1
elasticsearch: 2.1.1
The complete logstash config file is as follows:
input {
  stdin { }
}
filter {
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:logging_time}\t%{INT:biz_type}\t%{GREEDYDATA:data}\t%{IP:local_ip}"
    }
  }
  if [biz_type] {
    if ( [data] and [data] =~ /(.){(.+)}(.)/ ) {
      json {
        source => "data"
        remove_field => [ "data" ]
      }
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    document_type => "biz_log_record_%{biz_type}"
    index => "bizlog-%{+YYYY-MM-dd}"
    template_name => "template_revised"
    template => "/programs/server/logstash-2.1.1/template/dada-template.json"
    template_overwrite => true
    manage_template => true
  }
}

Detailed logstash and ES info follows. The raw input line and the rubydebug output:
2016-06-24 15:54:33 10050 {"transporter_id_list": [605603], "push_count": 1, "device_type": 1, "request_id": 1466754873939567710, "is_success": true, "push_id": null, "area_id": 10, "time_type": "test"} 10.10.183.101
{
    "@version" => "1",
    "@timestamp" => "2016-06-24T10:31:43.531Z",
    "host" => "cookorg.local",
    "logging_time" => "2016-06-24 15:54:33",
    "biz_type" => "10050",
    "local_ip" => "10.10.183.101",
    "transporter_id_list" => [
        [0] 605603
    ],
    "push_count" => 1,
    "device_type" => 1,
    "request_id" => 1466754873939567710,
    "is_success" => true,
    "push_id" => nil,
    "area_id" => 10,
    "time_type" => "test"
}

In the ES storage, I see a different request_id value:
{
    "_index": "bizlog-2016-06-24",
    "_type": "biz_log_record_10050",
    "_id": "AVWB9x3-qfZHqOoPMgIt",
    "_version": 1,
    "_score": 1,
    "_source": {
        "@version": "1",
        "@timestamp": "2016-06-24T10:31:43.531Z",
        "host": "cookorg.local",
        "logging_time": "2016-06-24 15:54:33",
        "biz_type": "10050",
        "local_ip": "10.10.183.101",
        "transporter_id_list": [
            605603
        ],
        "push_count": 1,
        "device_type": 1,
        "request_id": 1466754873939567600,
        "is_success": true,
        "push_id": null,
        "area_id": 10,
        "time_type": "test"
    }
}
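
For what it's worth, one way to check whether the stored _source really differs, or whether the rounding only happens in the tool that displays it, is to fetch the document and look at the raw response text before any JavaScript JSON parsing takes place. A minimal sketch (assuming ES is reachable on localhost:9200, the _id shown above, and a runtime with a global fetch such as Node 18+):

// Fetch the document and inspect the raw response body as text, so no
// JavaScript JSON parser gets a chance to round the 64-bit value into a double.
async function showRawRequestId(): Promise<void> {
  const res = await fetch(
    "http://localhost:9200/bizlog-2016-06-24/biz_log_record_10050/AVWB9x3-qfZHqOoPMgIt"
  );
  const raw = await res.text();
  const match = raw.match(/"request_id"\s*:\s*(\d+)/);
  // Prints the exact digits ES returns, independent of any UI.
  console.log(match ? match[1] : "request_id not found");
}

showRawRequestId();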

Has anyone else run into this issue?

I don't get it.
Don't you think this is an issue?

I moved the discussion to logstash.

To me, it's not happening in Elasticsearch but in logstash.
I mean that Elasticsearch stores the JSON content as is.

In my opinion, logstash parses the line correctly; the issue happens when the value is stored into ES.

What's the mapping of the field? Could the problem be related to https://github.com/elastic/elasticsearch/issues/14506, i.e. that ES stores things correctly but that the JavaScript library used to extract the documents doesn't handle a sufficient number of significant digits?
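
A quick way to check that mapping (assuming ES 2.x on localhost:9200, the index/type names from the config above, and a runtime with a global fetch):

// Inspect how ES mapped request_id. In ES 2.x the URL is GET /<index>/_mapping/<type>.
async function showMapping(): Promise<void> {
  const res = await fetch(
    "http://localhost:9200/bizlog-2016-06-24/_mapping/biz_log_record_10050"
  );
  const mapping = await res.json();
  // Expect something like "request_id": { "type": "long" } in the output.
  // If it is "long", ES itself keeps the full 64-bit value.
  console.log(JSON.stringify(mapping, null, 2));
}

showMapping();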

Maybe it is just an issue with how the value is displayed in Kibana.
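
That guess fits the numbers: 1466754873939567710 is far above Number.MAX_SAFE_INTEGER (2^53 - 1 = 9007199254740991), so any JavaScript-based JSON parser has to round it to the nearest IEEE-754 double, and that double prints as exactly 1466754873939567600. A minimal illustration:

// Parsing the original long with plain JavaScript numbers loses the low digits.
const original = "1466754873939567710";
const parsed = JSON.parse(`{"request_id": ${original}}`);

console.log(Number.MAX_SAFE_INTEGER);                 // 9007199254740991
console.log(parsed.request_id);                       // 1466754873939567600
console.log(String(parsed.request_id) === original);  // false - digits lost in display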