Problem when loading a CSV file using Logstash

Hello everyone,

I am trying to load a CSV file with the following config:

###############################################################################################################

input
{
	file
	{
		path => "C:/Users/BEKRISO/KIBANA7.0.1/INPUT/ESHMA_V2_9r_piste_audit.csv"
		start_position => "beginning"
		sincedb_path => "C:/Users/BEKRISO/KIBANA7.0.1/sincedb"
		codec => plain { charset => "UTF-8" }
	}
}

############################################################################################################################

filter
{
	csv
	{
		separator => ","
		columns => ["Date et heure","Utilisateur","Code","Libelle evenement","Code retour","Application","Code site","Type de table","Objet Start","Usage cache","Valeur avant modif","Valeur apres modif","SDT"]
	}

	mutate
	{
		convert => {
			"Date et heure" => "string"
			"Utilisateur" => "string"
			"Code" => "integer"
			"Libellé évènement" => "string"
			"Code retour" => "string"
			"Application" => "string"
			"Code site" => "integer"
			"Type de table" => "string"
			"Objet Start" => "string"
			"Usage cache" => "string"
			"Valeur avant modif" => "string"
			"Valeur après modif" => "string"
			"SDT" => "string"
		}

		# Handling of accented field names
		rename => {
			"Libelle evenement" => "Libellé évènement"
			"Valeur apres modif" => "Valeur après modif"
		}

		# Remove the carriage returns (\r) from the last field
		gsub => [ "message", "[\r]", "" ]
	}

	date { match => [ "Date et heure", "dd/MM/YY HH:mm" ] }
}

##############################################################################################################################

output
{
	elasticsearch
	{
		hosts => "cas0000658713:9200"
		index => "monbeaunode_1"
	}
	
	stdout { codec => rubydebug }

}

and with the following mapping:

PUT monbeaunode_1/_mapping
{
   "properties":{
      "Date et heure":{
         "type":"text"
      },
      "Utilisateur":{
         "type":"keyword"
      },
      "Code":{
         "type":"keyword"
      },
      "Libellé évènement":{
         "type":"text"
      },
      "Code retour":{
         "type":"keyword"
      },
      "Application":{
         "type":"keyword"
      },
      "Code site":{
         "type":"keyword"
      },
      "Type de table":{
         "type":"keyword"
      },
      "Objet start":{
         "type":"keyword"
      },
      "Usage cache":{
         "type":"keyword"
      },
      "Valeur avant modif":{
         "type":"text"
      },
      "Valeur après modif":{
         "type":"text"
      },
      "SDT":{
         "type":"keyword"
      }
   }
}
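
As a quick sanity check, the mapping that is actually live on the index can be verified from the Kibana Dev Tools console (the index name is the one from the config above):

GET monbeaunode_1/_mapping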

But the data isn't loaded: when I check the field contents in Kibana, I only find the field names, as in the following picture:

Can anyone help me, please?
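
For reference, with a standard Logstash install on Windows a config like this is typically run from the Logstash directory with something like the following (the config file name here is only a placeholder, not one given in the thread):

bin\logstash.bat -f C:\Users\BEKRISO\KIBANA7.0.1\pipeline.conf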

Are you sure the date picker range you have selected includes the timestamps on the CSV records?
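
If no documents appear for any time range, it can also help to first confirm whether anything was indexed at all, independently of Kibana's time picker, e.g. from Dev Tools:

GET monbeaunode_1/_count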

Thank you for the solution. I would like to know what this WARNING means:

[2019-06-25T09:25:47,152][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"monbeaunode_1", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x1298297>], :response=>{"index"=>{"_index"=>"monbeaunode_1", "_type"=>"_doc", "_id"=>"x6-FjWsBVpSJqxpd81Ir", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [Date et heure] of type [date] in document with id 'x6-FjWsBVpSJqxpd81Ir'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [19/06/2019 15:41] with format [dd/MM/YY HH:mm]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Text '19/06/2019 15:41' could not be parsed at index 8"}}}}}}

Here is your error with the date format: it's failing while parsing the 'Date et heure' field using the date filter.

Can you try with this format and test?
dd/MM/yyyy HH:mm
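
For reference, only the pattern in the date filter from the config above needs to change:

date { match => [ "Date et heure", "dd/MM/yyyy HH:mm" ] }

In the java.time patterns that Elasticsearch 7 uses, lowercase yyyy is the four-digit year, while uppercase YY is a two-digit week-based year. With dd/MM/YY the parser consumes only the "20" of "2019" and then fails on the third year digit, which is exactly the "could not be parsed at index 8" in the warning above.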

Thank you for your response.
I still have a problem. Here is an example for one message of my CSV.
We can see that in the message the year in the date field is 2019, but it is loaded as 1970.
Can anyone help, please?

Application:
    GL
Valeur après modif:
    - 
path:
    C:/Users/BEKRISO/KIBANA7.0.1/INPUT/ESHMA_V2_9r_piste_audit.csv
@version:
    1
@timestamp:
    Jun 25, 2019 @ 10:49:51.190
host:
    CAS0000658713
Libellé évènement:
    Appel à la passerelle par une application cliente
Usage cache:
    NON
Code site:
    990
Objet Start:
    TA-KVTYPENT-0
Type de table:
    AS
message:
    19/06/2019 17:00,,1,Appel à la passerelle par une application cliente,05,GL,0990,AS,TA-KVTYPENT-0,NON,V1,,
Utilisateur:
    - 
Code retour:
    05
Date et heure:
    Jun 19, 1970 @ 19:00:00.000
Valeur avant modif:
    V1
SDT:
    - 
Code:
    1
_id:
    EoHTjWsBUHuXczI-vNJt
_type:
    _doc
_index:
    monbeaunode_1
_score:
    1

So, now you can see the data being indexed to Elasticsearch. The only issue is with the date, right?

Exactly

Hmm. This is strange. I'm able to convert the exact same date format ('dd/MM/yyyy HH:mm') using the csv + date filters.
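
A minimal sketch of that kind of test (an assumed setup, not necessarily the replier's exact one): pipe the sample message from the post above into a pipeline that runs only the csv and date filters with the four-digit year pattern, and check what comes out:

input { stdin { } }

filter
{
	csv
	{
		separator => ","
		columns => ["Date et heure","Utilisateur","Code","Libelle evenement","Code retour","Application","Code site","Type de table","Objet Start","Usage cache","Valeur avant modif","Valeur apres modif","SDT"]
	}
	# yyyy (calendar year), not YY
	date { match => [ "Date et heure", "dd/MM/yyyy HH:mm" ] }
}

output { stdout { codec => rubydebug } }

With no explicit target, the date filter writes the parsed value to @timestamp, so the rubydebug output makes the 2019-vs-1970 difference immediately visible.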
