ALB - Problem mapping types when importing CSV via Logstash

Hello, all.

I have a problem converting, or forcing, the types of fields when importing CSV files.

I want to import the following file into Elasticsearch using Logstash:

12/08/2018-17:56:58,4802,34.25.128.24,Zapatillas de deporte
12/08/2018-17:56:59,26314,32.45.115.24,Bombillas de bajo consumo
12/08/2018-17:57:02,28612,10.135.24.255,Bombillas de bajo consumo
12/08/2018-17:57:06,27649,32.45.115.24,Bombillas de bajo consumo
12/08/2018-17:57:07,10745,10.135.24.255,Lampara de led
12/08/2018-17:57:08,4002,34.25.128.24,Zapatillas de deporte
12/08/2018-17:57:10,20914,295.23.1.0,Lampara de led
12/08/2018-17:57:13,22127,10.135.24.255,Bombillas de bajo consumo
12/08/2018-17:57:16,4808,34.25.128.24,Zapatillas de deporte

And I use this conf file:

[root@nodo2 logstash-6.3.1]# cat logstash_csv.conf

input {
  file {
    path => ["/var/tmp/datos_csv.txt"]
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["momento","cantidad","ip","texto"]
    convert => { "momento" => "date" "cantidad" => "integer" }
  }
}

output {
  stdout { }
  elasticsearch {
    hosts => ["http://192.168.1.140:9200"]
    index => "alex-csv"
    workers => 1
    user => "elastic"
    password => "elastic"
  }
}
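
As a side note, a config file like this can be syntax-checked before running it with, for example:

bin/logstash -f logstash_csv.conf --config.test_and_exit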

In the resulting index, the type of the field "cantidad" is "long", but the type of the field "momento" is "text" (an excerpt of the mapping follows):
....
"cantidad": {
  "type": "long"
},
...
"momento": {
  "type": "text",
  "fields": {
    "keyword": {
      "type": "keyword",
      "ignore_above": 256
    }
  }
},
...
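
(For reference, a mapping excerpt like this can be fetched with the _mapping API, using the index name and credentials from the config above, e.g.:

curl -XGET 'http://192.168.1.140:9200/alex-csv/_mapping?pretty' -u elastic:elastic
)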

I cannot see the mistake in the Logstash config file. Please, can you help me?

Thank you very much.

The Ruby CSV converter has no way of knowing whether 12/08/2018 means December 8th or August 12th, so it does not convert the value. You will need to use a date filter instead.
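
For example, here is a minimal sketch of the filter section, assuming the day comes first in your timestamps (dd/MM/yyyy; swap to MM/dd/yyyy if it is month-first):

filter {
  csv {
    separator => ","
    columns => ["momento","cantidad","ip","texto"]
    # keep the integer conversion, but drop the ambiguous "date" conversion
    convert => { "cantidad" => "integer" }
  }
  date {
    # parse the exact layout instead of letting Ruby guess
    match => ["momento", "dd/MM/yyyy-HH:mm:ss"]
    # write the parsed date back into "momento" (the default target is @timestamp)
    target => "momento"
  }
}

One caveat: if the index already exists with "momento" mapped as text, the mapping will not change by itself. You will need to delete and recreate the index (or reindex into a new one) after fixing the pipeline.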

Thank you very much, I will try it and will be back with the result.
