I have a function score using a gaussian decay. I would like the scores to never reach zero after the multiplier, so that when the gaussian function decays to zero, the documents still retain some relevance score.
Ideally, I could add a small constant to the gaussian value before the multiplication; that would solve my issue.
Does anyone have any suggestions on how to get this to work?
My interim (but not ideal) solution is to filter the decay function to a certain radius to prevent multiplying by zero, then add another function with a filter beyond that same radius that multiplies by a very small value.
function_score lets you combine functions arbitrarily. For example, to avoid zero scores, you can add a constant 0.001 to your gaussian values by including a second function with a weight of 0.001 and setting score_mode to sum:
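A minimal sketch of that query, assuming a hypothetical geo_point field named `location` and placeholder `origin`/`scale` values; adjust these to your mapping. The gaussian decay and the constant weight are summed first, and the result multiplies the query score, so the multiplier never drops below 0.001:

```json
{
  "query": {
    "function_score": {
      "query": { "match_all": {} },
      "functions": [
        {
          "gauss": {
            "location": {
              "origin": "40.0,-70.0",
              "scale": "2km"
            }
          }
        },
        { "weight": 0.001 }
      ],
      "score_mode": "sum",
      "boost_mode": "multiply"
    }
  }
}
```

With `score_mode: sum`, the combined function value is `gauss(...) + 0.001`, which stays positive even where the decay reaches zero; `boost_mode: multiply` then applies that as the multiplier on the query score.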
Thank you for the reply and the suggestion. I need to run some tests, but I think that will work.