How to read CSV data in Logstash and convert columns to date, integer, boolean, etc.

I got an error when creating the index in Logstash:

Could not find log4j2 configuration at path /ElasticProducts/logstash-5.2.0/config/log4j2.properties. Using default config which logs to console
08:38:21.769 [LogStash::Runner] ERROR logstash.agent - fetched an invalid config {:config=>"input {\n\tstdin {\n\t\t\ttype => "stdin-type"\n\t\t}\n\tfile

My config file is as below:

input {
  stdin {
    type => "stdin-type"
  }
  file {
    path => ["C:/Elastic/Data.csv"]
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["ITIL_Incident_ID","Incident_Number","Incident_Create_Date","Incident_Created_By_Name","P1_Incident_Flag"]
    separator => ","
    convert => {
      "ITIL_Incident_ID" => "integer" ,
      "P1_Incident_Flag" => "boolean"
    }
  }
  date {
    match => [ "Incident_Create_Date" , "mm/dd/yy HH:mm" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "csvfile"
  }
  stdout { codec => rubydebug }
}

Can you share the entire error?

Could not find log4j2 configuration at path /ElasticProducts/logstash-5.2.0/config/log4j2.properties. Using default config which logs to console
21:40:56.393 [LogStash::Runner] ERROR logstash.agent - fetched an invalid config {:config=>"input {\n\n\tstdin {\n\t\t\tpath => ["C:/ElasticProducts/IncidentTestData.csv"]\n\t\t\tstart_position => "beginning"\n\t\t}\n\t}\n\nfilter {\ncsv {\n\n\t\tcolumns => ["ITIL_Incident_ID","Incident_Number","Incident_Create_Date","Incident_Created_By_Name","P1_Incident_Flag"]\n\n\t\tseparator => ","\n\n\t\tconvert => { \n\n [ "ITIL_Incident_ID" => "integer" ]\n\t[ "P1_Incident_Flag" => "integer" ]\n\n \t\t }\n\n\tdate {\n \t\tmatch => [ "Incident_Create_Date" , "mm/dd/yy HH:mm" ]\n \t\t}\n \t}\n\n}\n \n \n\noutput {\n elasticsearch {\n hosts => ["localhost:9200"]\n user => "elastic"\n password => "changeme" \n # protocol => "http"\n index => "csvfile"\n #template => ""\n }\n # stdout { codec => rubydebug }\n}\n", :reason=>"Expected one of #, -, ", ', } at line 18, column 5 (byte 294) after filter {\ncsv {\n\n\t\tcolumns => ["ITIL_Incident_ID","Incident_Number","Incident_Create_Date","Incident_Created_By_Name","P1_Incident_Flag"]\n\n\t\tseparator => ","\n\n\t\tconvert => { \n\n "}

The new config file is:

input {
  stdin {
    path => ["C:/ElasticProducts/IncidentTestData.csv"]
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["ITIL_Incident_ID","Incident_Number","Incident_Create_Date","Incident_Created_By_Name","P1_Incident_Flag"]
    separator => ","
    convert => {
      [ "ITIL_Incident_ID" => "integer" ]
      [ "P1_Incident_Flag" => "integer" ]
    }
    date {
      match => [ "Incident_Create_Date" , "mm/dd/yy HH:mm" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "changeme"
    # protocol => "http"
    index => "csvfile"
    #template => ""
  }
  stdout { codec => rubydebug }
}

The convert syntax is wrong; check out https://www.elastic.co/guide/en/logstash/current/plugins-filters-csv.html#plugins-filters-csv-convert

I still got an error even after changing the convert format.

PS C:\ElasticProducts\logstash-5.2.0\bin> logstash -f csv.conf

Could not find log4j2 configuration at path /ElasticProducts/logstash-5.2.0/config/log4j2.properties. Using default config which logs to console
22:44:58.945 [LogStash::Runner] ERROR logstash.agent - fetched an invalid config {:config=>"input {\n\n\tfile {\n\t\t\tpath => ["C:/ElasticProducts/IncidentTestData.csv"]\n\t\t\tstart_position => "beginning"\n\t\t\tsincedb_path => "/dev/null"\n\t\t}\n\t}\n\nfilter {\ncsv {\n\n\t\tcolumns => ["ITIL_Incident_ID","Incident_Number","Incident_Create_Date","Incident_Created_By_Name","P1_Incident_Flag"]\n\n\t\tseparator => ","\n\n\tconvert => { "ITIL_Incident_ID" => "integer" , "P1_Incident_Flag" => "boolean" }\n\n \t}\n\n \tdate {\n \t\tmatch => ["Incident_Create_Date" , "mm/dd/yy HH:mm"]\n \t }\n}\n \n \n\noutput {\n elasticsearch {\n hosts => ["localhost:9200"]\n user => "elastic"\n password => "changeme" \n # protocol => "http"\n index => "csvfile"\n #template => ""\n }\n stdout {\n id => "ITIL_Incident_ID"\n }\n}\n", :reason=>"Expected one of #, {, -, ", ', } at line 17, column 47 (byte 349) after filter {\ncsv {\n\n\t\tcolumns => ["ITIL_Incident_ID","Incident_Number","Incident_Create_Date","Incident_Created_By_Name","P1_Incident_Flag"]\n\n\t\tseparator => ","\n\n\tconvert => { "ITIL_Incident_ID" => "integer" "}

The config file is as below:

filter {
  csv {
    columns => ["ITIL_Incident_ID","Incident_Number","Incident_Create_Date","Incident_Created_By_Name","P1_Incident_Flag"]
    separator => ","
    convert => { "ITIL_Incident_ID" => "integer" , "P1_Incident_Flag" => "integer" }
  }
  date {
    match => ["Incident_Create_Date" , "mm/dd/yy HH:mm"]
  }
}

Please make sure you provide the entire error!

It has the full error message. It's saying the error is at line 17, column 47 now.

There shouldn't be any commas between the hash items, i.e.

convert => { "ITIL_Incident_ID" => "integer" , "P1_Incident_Flag" => "integer" }

should be

convert => { "ITIL_Incident_ID" => "integer" "P1_Incident_Flag" => "integer" }

This is a documentation bug that was fixed a long time ago, but a new plugin release hasn't been made, so elastic.co is out of date.
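For reference, here's a minimal sketch of a full corrected filter section, using the same columns (I've assumed P1_Incident_Flag should be a boolean, as in your first attempt). Note also that in the date filter's pattern, month is uppercase MM; lowercase mm means minute-of-hour, so "mm/dd/yy HH:mm" would parse the month position as minutes:

filter {
  csv {
    columns => ["ITIL_Incident_ID","Incident_Number","Incident_Create_Date","Incident_Created_By_Name","P1_Incident_Flag"]
    separator => ","
    # hash entries are separated by whitespace, not commas
    convert => {
      "ITIL_Incident_ID" => "integer"
      "P1_Incident_Flag" => "boolean"
    }
  }
  date {
    # MM = month-of-year; mm would be minute-of-hour
    match => [ "Incident_Create_Date", "MM/dd/yy HH:mm" ]
  }
}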

The index was created successfully, but the datatype is still string, and multiple fields are created for the same column, e.g. Incident_Create_Date and Incident_Create_Date.keyword.

Why is that?

Elasticsearch doesn't know what you want to do with the mappings, so by default it indexes string fields both as text and as keyword.

What a keyword field is: https://www.elastic.co/guide/en/elasticsearch/reference/5.2/keyword.html
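If you want real date/integer/boolean datatypes instead of the dynamic text/keyword pair, supply the mappings yourself before the index is created, e.g. with an index template. Here's a minimal sketch for Elasticsearch 5.2, assuming the csvfile index name and the elastic/changeme credentials from your output section (the _default_ mapping applies to whatever document type Logstash writes):

curl -u elastic:changeme -XPUT 'localhost:9200/_template/csvfile' -d '
{
  "template": "csvfile",
  "mappings": {
    "_default_": {
      "properties": {
        "ITIL_Incident_ID":     { "type": "integer" },
        "P1_Incident_Flag":     { "type": "boolean" },
        "Incident_Create_Date": { "type": "date", "format": "MM/dd/yy HH:mm" }
      }
    }
  }
}'

Templates only apply when an index is created, so delete the existing csvfile index and re-run Logstash afterwards.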
