Long data type value corruption

I am using Elasticsearch version 5.1.2. There is some kind of number overflow when I try to ingest a long value close to Long.MAX_VALUE.
I have tried the same with ES 2.4 as well. Same issue again.

PUT /library/books/1
{
  "longValue1": -9223372036854775808,
  "longValue2": 9223372036854775807
}

GET /library/_search

This returns:
{
  "took": 10,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "failed": 0
  },
  "hits": {
    "total": 1,
    "max_score": 1,
    "hits": [
      {
        "_index": "library",
        "_type": "books",
        "_id": "1",
        "_score": 1,
        "_source": {
          "longValue1": -9223372036854776000,
          "longValue2": 9223372036854776000
        }
      }
    ]
  }
}

Value: -9223372036854775808 becomes -9223372036854776000
Value: 9223372036854775807 becomes 9223372036854776000

Is this a known issue?
Please advise.

Thanks,
Sanal

Same issue here!

We use similarly large long values for internal IDs, but we store them as strings because of this issue. I'm not sure if it will help you, but it works well for us.

If we map the field as a string at ingest time, the _mapping for the respective index/type shows the data type as "string" instead of "long", which is expected.
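For reference, a rough sketch of what that explicit mapping looks like with the 2.x syntax (a "not_analyzed" string); on 5.x the equivalent field type would be "keyword". The index, type, and field names are simply the ones from the example above.

PUT /library
{
  "mappings": {
    "books": {
      "properties": {
        "longValue1": { "type": "string", "index": "not_analyzed" },
        "longValue2": { "type": "string", "index": "not_analyzed" }
      }
    }
  }
}

With that mapping the values are ingested as JSON strings ("9223372036854775807"), so nothing along the way treats them as numbers.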

If one needs to retain the _mapping as "long", is there any way out?

From what I understand, the issue isn't actually Elasticsearch, which supports long values just fine, but rather JavaScript/JSON clients, which parse every JSON number as a double-precision float. If you can get your data into and out of ES without that conversion, you'll be all set. We use JavaScript, so we had to go the string route.
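For anyone curious, here is a minimal Node.js/TypeScript sketch of that rounding; the field names are just the ones from the example above, and the printed values come from the runtime's number-to-string formatting.

// JSON.parse turns every JSON number into an IEEE 754 double, which can only
// represent integers exactly up to Number.MAX_SAFE_INTEGER (2^53 - 1), so
// anything near Long.MAX_VALUE gets rounded to the nearest double.
const body = '{"longValue1": -9223372036854775808, "longValue2": 9223372036854775807}';
const parsed = JSON.parse(body);

console.log(parsed.longValue1);        // -9223372036854776000 (rounded)
console.log(parsed.longValue2);        //  9223372036854776000 (rounded)
console.log(Number.MAX_SAFE_INTEGER);  //  9007199254740991

// Keeping the value as a JSON string preserves the exact digits, because the
// JSON layer never coerces it to a number.
const asString = '{"longValue2": "9223372036854775807"}';
console.log(JSON.parse(asString).longValue2); // "9223372036854775807" (exact)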

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.