About the "convert" function for CSVs

Hey there!

I've got some questions about importing CSVs into Elasticsearch with Logstash.

I've got a CSV with data such as text and numbers (obviously), but also IPv4 addresses and possibly locations (can country abbreviations like EN, DE, etc. be geo_point data?).

I saw that when you first import a CSV into Elastic, all the data ends up as strings. Then, in some documentation, I saw that the csv filter has an option that can convert your data to the type you want (the "convert" option).

So here's my question: is the "convert" option really only for converting to integer, float, date, date_time, and boolean, or can you also use it to convert to ip and geo_point data? If not, is there a way to convert that data after it has been imported into Elastic?
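For context, here's roughly what I mean, a minimal sketch of the csv filter with convert (the column names are invented as an example; the documented target types are integer, float, date, date_time, and boolean):

```
filter {
  csv {
    separator => ","
    # example column names, not from a real file
    columns   => ["message_text", "count", "client_ip"]
    # convert only accepts integer, float, date, date_time, boolean
    convert   => {
      "count" => "integer"
    }
  }
}
```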

Thanks in advance!

Anyone?

geo_points and IP addresses cannot be automatically detected by Elasticsearch, which means it is not enough to format them in a specific way in Logstash. You will need to define the mapping of these fields in an index template.
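As a sketch, assuming your CSV produces fields named client_ip and location (made-up names), a template could look like this; the exact template syntax and the name of the mapping section vary between Elasticsearch versions (older versions wrap "properties" in a document type):

```
PUT _template/csv_template
{
  "index_patterns": ["csv-*"],
  "mappings": {
    "properties": {
      "client_ip": { "type": "ip" },
      "location":  { "type": "geo_point" }
    }
  }
}
```

Any index whose name matches csv-* and is created after the template exists will then map those fields as ip and geo_point instead of strings.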

Yeah, that's what I thought...

Thanks!

It's me again.

When I tried to use the convert option (in the csv filter or even in the mutate filter), it didn't work at all; in Kibana my number column stays a string. Maybe it comes from my index?

If the field has already been mapped as a string in the index, the mapping will not change. If, however, you index into a new index, it should be able to identify the mapping correctly.
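One way to check and start over, assuming your index is called csv-data (a made-up name):

```
# Inspect how the fields were actually mapped
GET csv-data/_mapping

# If the number field is stuck as a string, drop the index...
DELETE csv-data

# ...then re-run the Logstash pipeline so the documents are
# re-indexed with the convert option (and any template) in place
```

Note that DELETE removes all documents in the index, so only do this if you can re-ingest the CSV.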

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.