BigDecimal in Elasticsearch

Hi

I have an Oracle table where most of the number fields are BigDecimal. I
want to migrate all the data from the table to Elasticsearch. If I don't
provide any mapping, Elasticsearch treats the BigDecimal values as
double and precision is lost. I was wondering if Elasticsearch provides
support for a BigDecimal field type. My understanding is that it
doesn't (
http://www.elasticsearch.org/guide/reference/mapping/core-types/). Right
now, as a workaround, I use an explicit mapping where I map all
the BigDecimal fields to string. My question is: how do I efficiently index
these BigDecimal fields? I believe that using a string for BigDecimal
has performance issues.
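
For reference, this is the precision loss I am talking about, in plain Java
(the value below is just an example I made up):

import java.math.BigDecimal;

public class PrecisionLoss {
    public static void main(String[] args) {
        BigDecimal value = new BigDecimal("12345.678901234567890123456789");
        // Without a mapping, Elasticsearch stores the value as a double,
        // so digits beyond double precision are rounded away:
        System.out.println(value.doubleValue());
        // With my string workaround every digit survives:
        System.out.println(value.toPlainString());
    }
}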

Thanks
Sameer


Yes, Elasticsearch supports it.

What kind of performance issues do you mean? If you want fast
floating-point numbers, do not use BigDecimal.

BigDecimal has to be converted to a double or formatted as a string anyway,
because JSON has no BigDecimal type. You can parameterize the
conversion by scale and rounding mode (the default is the scale of the
BigDecimal value and rounding mode HALF_UP).
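
The equivalent in plain Java looks like this (the scale of 2 below is just
an example I picked, not something Elasticsearch chooses for you):

import java.math.BigDecimal;
import java.math.RoundingMode;

public class ScaleAndRounding {
    public static void main(String[] args) {
        BigDecimal value = new BigDecimal("123.456789");

        // Default behaviour: keep the value's own scale, round HALF_UP.
        double d = value.setScale(value.scale(), RoundingMode.HALF_UP)
                        .doubleValue();
        System.out.println(d);  // 123.456789 (as close as a double allows)

        // Parameterized: explicit scale of 2, also HALF_UP.
        System.out.println(value.setScale(2, RoundingMode.HALF_UP)
                                .doubleValue());  // 123.46

        // Or format as a string to keep all digits.
        System.out.println(value.toPlainString()); // 123.456789
    }
}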

Jörg

On 15.07.13 08:02, sameer mohammed wrote:

> I was wondering if Elasticsearch provides support for a BigDecimal
> field type.
