Float type issue: should it be resolved during ingestion or before?

Hello,

I'm having some problems with floats. The data comes from a CSV file, and the source looks like this:

"name","avg_rank"
"richard","1,65"
"john","8,34"
"dan","0,78"
"alicia","14,659"

Now, because Elasticsearch was treating the avg_rank values as strings, I've done something like this in my Logstash config file:

mutate { convert => {"avg_rank" => "float"} }

Looking at the hits when querying, I can see, for example, that the avg_rank value for john is 834.0 instead of 8.34:

{
	"name": "john",
	"avg_rank": 834.0
}

Can this only be resolved upstream, by working on my data source (replacing the commas with dots, for example), or is there a way to make Logstash deal with it?
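
To make the question concrete, what I had in mind on the Logstash side is something like this (just a sketch; it assumes the csv filter has already split the line and that avg_rank is still a string at this point):

filter {
  # replace the decimal comma with a dot before the numeric conversion
  mutate {
    gsub => ["avg_rank", ",", "."]
  }
  mutate {
    convert => { "avg_rank" => "float" }
  }
}

(As far as I understand, mutations inside a single mutate block run in a fixed order in which convert happens before gsub, so I've split them into two separate blocks to be safe.)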

Thank you! :slightly_smiling_face:

For the moment I've resolved it by working on the CSV input itself, but I'd be happy if anyone has suggestions for dealing with an imperfectly formatted file like this, where avg_rank is supposed to be a float (see the sketch after the sample below):

"name","avg_rank"
"richard","1,65"
"john","8,34"
"dan","0,78"
"alicia","14,659"

Thanks
