Field explicitly mapped as type long is rolling over to negative at 2147483647

I have a field that I've explicitly mapped as type long, but I see it rolling over to negative at the maximum value for a signed 32-bit integer.

My hunch is that it has something to do with the Painless script I'm using to update it. I'm guessing it does the math with 32-bit integers and then assigns the result to my long field.
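A minimal sketch of what I suspect is happening, in plain Java since Painless is Java-based (the `treeSize` variable here is just a stand-in for my field, not anything from Elasticsearch):

```java
public class IntOverflowDemo {
    public static void main(String[] args) {
        // If the stored value comes back as a 32-bit int, the add wraps around.
        int treeSize = Integer.MAX_VALUE;   // 2147483647
        int wrapped = treeSize + 1;         // two's-complement wraparound
        System.out.println(wrapped);        // -2147483648

        // Widening either operand to long keeps the arithmetic in 64 bits.
        long widened = (long) treeSize + 1L;
        System.out.println(widened);        // 2147483648
    }
}
```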

The script is pretty simple; here's an example:

```json
{
  "upsert": { "isi_tree_size": 289000 },
  "script": {
    "lang": "painless",
    "inline": "ctx._source.isi_tree_size += params.isi_tree_size",
    "params": { "isi_tree_size": 289000 }
  }
}
```

Could that be the problem? If so, how do I make sure the field is treated as a long and not an int?

I'm using Elasticsearch 5.0.0-alpha5.

I create the index with an explicit mapping that sets the isi_tree_size field to long, and when I query the mapping, the response appears to confirm that isi_tree_size is a long.

I did a test where I set the field to 2147483647 and then used my script to add 1 to it, and it always rolls over to -1.

But from that I was able to figure out that if I change the "inline" script to:

```
ctx._source.isi_tree_size = (long)ctx._source.isi_tree_size + (long)params.isi_tree_size
```

then I can add 1 to 2147483647 and it correctly becomes 2147483648.
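For reference, here is the full update body with the cast applied — identical to my earlier request except for the "inline" script:

```json
{
  "upsert": { "isi_tree_size": 289000 },
  "script": {
    "lang": "painless",
    "inline": "ctx._source.isi_tree_size = (long)ctx._source.isi_tree_size + (long)params.isi_tree_size",
    "params": { "isi_tree_size": 289000 }
  }
}
```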

That's very surprising.