I am using Elasticsearch version 5.1.2. There is some kind of number overflow when I try to ingest a long value close to Long.MAX_VALUE.
I have tried the same with ES 2.4 as well. Same issue again.
We use similar long numeric values for internal IDs, but we store them as strings because of this issue. I'm not sure if it will help you, but it works well for us.
If we map the field as string during ingest, the _mapping for the respective index/type shows the data type as "string" instead of "long", which is expected but not what we want.
If one needs to retain the _mapping as "long", is there any way out?
From what I understand, the issue isn't actually Elasticsearch, which properly supports long values, but rather JavaScript/JSON clients: JavaScript numbers are IEEE 754 doubles, so they can only represent integers exactly up to 2^53 - 1. If you can get your data into and out of ES without type conversion problems, you'll be all set. We use JavaScript, so we had to go the string route.