Based on the documentation for the mutate filter it looks like I cannot convert a value to a double, only a float.
Is there a way with a Ruby filter to convert to a double instead?
"float" should be interpreted as "floating-point number", not "the single-precision floating-point data type known from C". All floating-point numbers in Ruby (and therefore Logstash) have double precision.
Interesting, thanks for the info. That's inconsistent with Elasticsearch's numeric types, where they do differentiate between float and double.
The problem I'm having is that I'm using Dremio to sit between ES and some business intelligence software that only speaks SQL. Dremio enforces what it finds in the mapping; i.e., since the mapping says float, it will cast the value to float even though ES has a double in the field.
Here is the mapping of a sample ES document:
{
  "double_test": {
    "mappings": {
      "test_point": {
        "properties": {
          "value": {
            "type": "float"
          }
        }
      }
    }
  }
}
and the document itself:
{
  "_index": "double_test",
  "_type": "test_point",
  "_id": "1",
  "_version": 1,
  "found": true,
  "_source": {
    "value": 3.141592653589793
  }
}
Within Dremio, the value is strictly interpreted as a float (screenshot omitted).
So I suppose my next question is: is there a way to set up ES such that the default numeric type is double, not float? (I realize that I can set mappings, but the name of the field is not constant.)
This is really an Elasticsearch question, but you should be able to do something with dynamic mappings.
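For example, here is a sketch of a dynamic template (the template name "doubles_instead_of_floats" is just an illustration) that maps any newly detected floating-point field to double regardless of the field's name, reusing the index and type names from the example above:

PUT double_test
{
  "mappings": {
    "test_point": {
      "dynamic_templates": [
        {
          "doubles_instead_of_floats": {
            "match_mapping_type": "double",
            "mapping": {
              "type": "double"
            }
          }
        }
      ]
    }
  }
}

JSON floating-point values are detected as the "double" match_mapping_type but mapped to float by default, so a template like this overrides that default for all dynamically created fields.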