Issue in CSV import functionality: timestamp field contains spaces and commas

Hello,

Kibana 7.10.1. I tried to use the CSV import functionality, and for some reason the first column header is replaced by a value with a leading and trailing space.



This seems like a bug? The CSV was exported from Excel.

Also, I seem to still have the exact same issue I had in April 2019: https://discuss.elastic.co/t/unable-to-import-csv-with-floats-in-7-x-rc2/175918


Values with a ',' (the comma is the decimal separator in Europe) keep being assigned a keyword mapping. Even when overriding the type to long, the upload seems to fail: the index ends up empty and I get an error.

Willem

Hi Willem,

You can convert the European format into the correct format using a painless script in the ingest pipeline like:

{
  "script": {
    "lang": "painless",
    "source": "ctx.totalGB = NumberFormat.getInstance(Locale.ITALY).parse(ctx.totalGB);"
  }
}

see https://www.elastic.co/guide/en/elasticsearch/reference/master/script-processor.html
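As a rough illustration of what that locale-aware conversion does (this is just a sketch in Python, outside Elasticsearch, not the painless code itself), for locales that use '.' as the thousands separator and ',' as the decimal separator:

```python
def parse_european_number(s: str) -> float:
    """Parse a number in European format, e.g. "1.234,56" -> 1234.56.

    Drops the '.' thousands separators, then turns the ',' decimal
    separator into '.' so Python's float() can parse it.
    """
    return float(s.replace(".", "").replace(",", "."))
```

So `parse_european_number("1.234,56")` yields `1234.56`, which is what the `NumberFormat.getInstance(Locale.ITALY)` call produces inside the pipeline.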

Use the correct Locale constant depending on the locale Excel used for the export.

If I remember correctly, Excel also lets you export numbers without formatting them in the current locale when exporting a CSV.

For the spaces issue, please double-check whether the CSV's first column header contains hidden characters (this can happen if the text was copied from an HTML file or similar).
In any case, you can always rename a field by adding to the pipeline:

{
  "rename": {
    "field": "provider",
    "target_field": "cloud.provider"
  }
}

see https://www.elastic.co/guide/en/elasticsearch/reference/master/rename-processor.html
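Putting the two together, a complete pipeline definition could look like this (a sketch using the example field names from above; adjust `totalGB`, `provider`, and the Locale to match your CSV):

```json
{
  "description": "Normalize European decimals and rename a header field",
  "processors": [
    {
      "script": {
        "lang": "painless",
        "source": "ctx.totalGB = NumberFormat.getInstance(Locale.ITALY).parse(ctx.totalGB);"
      }
    },
    {
      "rename": {
        "field": "provider",
        "target_field": "cloud.provider"
      }
    }
  ]
}
```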

@markov00 Thank you very much for your answers.

Although changing the locale in Excel would be a solution, it is not an option for me, as using a comma to separate the decimals is standard practice in Europe.

Using the painless script could also be an option, but IMHO this (basic) CSV import functionality is getting somewhat complex. In the meantime I switched to Power BI and was able to show the graphs I needed in less than 2 minutes. Could an enhancement request be an idea, so that the European decimal format is recognized out of the box?

(Also, I triple-checked the column headers and there were definitely no special characters in them.)

Thanks

Willem