NEST cutting off trailing zeros

I have Latitude and Longitude fields in my index, stored as double, that can contain trailing zeros which we treat as significant, e.g. 37.15580. When I query Elasticsearch through curl or Kibana, this field correctly returns all digits, including the trailing zero. However, when I query for this document through C# NEST or Elasticsearch's Python library, the trailing zero is removed. Is there any way to change this behavior so the trailing zeros are preserved?

To clarify: does the JSON response from Elasticsearch contain the trailing zero, but the zero is trimmed once the latitude or longitude value has been deserialized into a double and is written out again?

Why are trailing zeroes important?

The response I am getting comes from this NEST code (the trailing zero is already cut off by the time the response comes out of NEST):
var results = ElasticClient.Search<PostalGeo>(s => s // PostalGeo: document POCO type (name assumed)
    .Index(IndexProperties.POSTAL_INDEX)
    .Size(1)
    .Query(q => q
        .Bool(b => b
            .Must(mu => mu
                .Match(ma => ma
                    .Field(f => f.CityName)
                    .Query(sCityName)
                    .MinimumShouldMatch(MinimumShouldMatch.Percentage(100))))
            .Filter(f => f
                .Term(t => t.Field(fi => fi.CountryCode).Value(sCountryCode)) && f
                .Term(t => t.Field(fi => fi.StateCode).Value(sStateCode)) && f
                .Term(t => t.Field(fi => fi.PostalCode).Value(sPostalCode))))));

This code, using the low-level client, returns the latitude in the way I want it to:
var lowlevelClient = new ElasticLowLevelClient();

var searchResponse = lowlevelClient.Search<StringResponse>("mdm_postalgeo", PostData.Serializable(new
{
    from = 0,
    size = 10,
    query = new
    {
        term = new
        {
            PostalCode = "24059"
        }
    }
}));

The zeroes are important to us because they indicate the degree of precision of the given latitude and longitude.

Where in the response are you looking at double values? For example, on results.Documents.Select(d => d.Latitude)?

That'll be returning the JSON string representation of the numeric value. The same representation will be returned to NEST too, which is why I'd like to understand where you are looking at double values. Would you be able to provide a small code example, or screenshot?
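
To illustrate what I mean by the JSON string representation, with the low-level search above (a sketch; the value is the one from this thread):

// StringResponse.Body holds the raw JSON body returned by Elasticsearch, so the
// numeric text from _source (e.g. "37.15580") is exactly as stored; nothing has
// been parsed into a System.Double at this point.
var rawJson = searchResponse.Body;
Console.WriteLine(rawJson.Contains("37.15580")); // expected: True for the document in question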

Yes, I am seeing it as an element under results.Documents.Select. I have also looked under Hits and found the zero is cut off there too. The latitude in the screenshot below is 37.15580 in the actual Elasticsearch index.

[Screenshot: ElasticTrailingZero]

This is not a NEST issue, but the way doubles are handled in C#/the CLR: System.Double is an IEEE 754 value, and when a string representation is parsed into one, trailing fractional zero digits are not preserved, because the type carries no notion of how many significant digits were written.

void Main()
{
    var value1 = double.Parse("37.1558");

    // could make this 37.1558 followed by any number of trailing zeroes
    var value2 = double.Parse("37.15580");

    Console.WriteLine($"value1: {value1}");
    Console.WriteLine($"value2: {value2}");

    // put aside comparing doubles for equality with == for the moment :)
    Console.WriteLine($"value1 == value2: {value1 == value2}");
}

yields

value1: 37.1558
value2: 37.1558
value1 == value2: True

I would expect the same behaviour in other languages that adhere to IEEE 754.

Using System.Decimal will preserve the precision

// same code as before, but using Decimal
var value1 = Decimal.Parse("37.1558");
var value2 = Decimal.Parse("37.15580");

yields

value1: 37.1558
value2: 37.15580
value1 == value2: True

However, Elasticsearch internally will index these values as double numeric types, and any calculations within Elasticsearch will be performed with this precision.
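
For reference, that mapping in NEST attribute form might look like the following (a sketch; the POCO name is assumed and other fields are omitted):

// Sketch of an attribute-mapped document (name assumed, other fields omitted).
// [Number(NumberType.Double)] maps the fields as Elasticsearch double numeric types;
// that is the precision Elasticsearch works with, whatever formatting _source contains.
public class PostalGeo
{
    [Number(NumberType.Double)]
    public double Latitude { get; set; }

    [Number(NumberType.Double)]
    public double Longitude { get; set; }
}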

There are a couple of ways that you may want to retain this information:

  1. Create your own GeoLocation type that uses System.Decimal for the latitude and longitude properties, and deserialize these values into it (sketched below).
  2. Store the precision of the values in a separate field or fields that can be used to inform your usage, e.g. when the values are .ToString()'ed (also sketched below).
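
A minimal sketch of both options, with type and property names that are assumptions rather than anything in NEST or the existing mapping:

// Option 1 (sketch, names assumed): deserialize latitude/longitude into System.Decimal
// so the fractional digits parsed from _source can be retained.
public class DecimalGeoLocation
{
    public decimal Latitude { get; set; }
    public decimal Longitude { get; set; }
}

// Option 2 (sketch, names assumed): store the value and its precision separately,
// and reapply the precision whenever the value is formatted.
public class MeasuredCoordinate
{
    public double Value { get; set; }
    public int FractionalDigits { get; set; } // e.g. 5 for 37.15580

    public override string ToString() => Value.ToString("F" + FractionalDigits);
}

Whether the trailing zero survives option 1 depends on the configured serializer parsing the JSON number text directly as a decimal rather than going through a double first, so it is worth verifying with the serializer you are using.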

Thanks! That was very helpful. I'll take a look at those options.
