I am trying to do custom scoring using a date field, and I need the score to take the date into account with millisecond precision. However, when I run the code below, the score for both 2020-01-01T00:00:00.100Z and 2020-01-01T00:00:00.200Z is 1577836740000, which is the epoch millisecond for 2019-12-31T23:59:00.000Z. The scores should be 1577836800100 and 1577836800200, respectively.
Is there a way to produce a score that takes the full epoch millisecond into account?
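To double-check those expected numbers, the timestamps can be converted to epoch milliseconds with the Painless execute API (this assumes ES 6.3+, where _scripts/painless/_execute is available; it is not part of my original test):
POST /_scripts/painless/_execute
{
  "script": {
    "source": "Instant.parse('2020-01-01T00:00:00.100Z').toEpochMilli()"
  }
}
which should come back as 1577836800100 (and 1577836800200 for the .200Z value).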
I've tried decay functions as well, but they seem to have the same problem, so maybe the dates aren't stored with full precision in the first place?
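For reference, the decay attempt looked roughly like this (the origin, scale and decay values here are placeholders rather than the exact ones I used). With a scale on the order of a day, a 100 ms offset barely changes the decay value at all, so it disappears in the score:
POST test-index-date/_search
{
  "query": {
    "function_score": {
      "functions": [
        {
          "gauss": {
            "date": {
              "origin": "2020-01-01T00:00:00.000Z",
              "scale": "1d",
              "decay": 0.5
            }
          }
        }
      ]
    }
  }
}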
EDIT: I've since tested it with ES 7.x as well, since the underlying date type there is supposed to support milliseconds, but the problem still exists: the score is a 32-bit float, so the underlying long epoch milliseconds still get truncated.
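One way to confirm that the milliseconds really are stored, and that the precision is only lost in the float score, is to return the raw doc value through a script field. A sketch against the same index, using the ES 7.x java-time accessor (on 6.x it would be doc['date'].value.getMillis() instead):
POST test-index-date/_search
{
  "query": {
    "match_all": {}
  },
  "script_fields": {
    "date_millis": {
      "script": {
        "source": "doc['date'].value.toInstant().toEpochMilli()"
      }
    }
  }
}
If the stored values are intact, this should return 1577836800100 and 1577836800200 in the fields section of each hit.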
Seems to be a duplicate of this issue: More accurate date based scoring
Not sure the answers have changed in the last 3 years.
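For what it's worth, when only the relative ordering matters, a plain sort on the date field keeps full precision, because sort values come back as epoch-millisecond longs rather than float scores; it doesn't help, though, when the date has to be blended with other scoring functions:
POST test-index-date/_search
{
  "query": {
    "match_all": {}
  },
  "sort": [
    {
      "date": {
        "order": "desc"
      }
    }
  ]
}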
PUT test-index-date
{
"mappings": {
"item": {
"properties": {
"date": {
"type": "date"
}
}
}
}
}
PUT test-index-date/item/1
{
"date": "2020-01-01T00:00:00.100Z"
}
PUT test-index-date/item/2
{
"date": "2020-01-01T00:00:00.200Z"
}
POST test-index-date/_search
{
"query": {
"bool": {
"should": [
{
"function_score": {
"functions": [
{
"field_value_factor": {
"field": "date",
"modifier": "none"
}
}
]
}
}
]
}
}
}
Result
{
"took": 4,
"timed_out": false,
"_shards": {
"total": 5,
"successful": 5,
"skipped": 0,
"failed": 0
},
"hits": {
"total": 2,
"max_score": 1577836740000,
"hits": [
{
"_index": "test-index-date",
"_type": "item",
"_id": "2",
"_score": 1577836740000,
"_source": {
"date": "2020-01-01T00:00:00.200Z"
}
},
{
"_index": "test-index-date",
"_type": "item",
"_id": "1",
"_score": 1577836740000,
"_source": {
"date": "2020-01-01T00:00:00.100Z"
}
}
]
}
}