Logstash Pipeline CSV Date Filter Issue

Hello,

I have an issue with a pipeline that should ingest data from a CSV file. When I add a date conversion to the filter section, because I need the "Time" field stored in Elasticsearch as a date, the pipeline stops working.

The config is below:

input {
  file {
    path => "${STATEMENT_FILEPATH}/csv_test7.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    skip_header => "true"
    columns => ["No.","Time","Source","Destination","Protocol","Length","Info"]
    convert => {
      "No." => "integer"
      "Time" => "date_time"
    }
  }
  date {
    match => [ "Time", "ISO8601" ]
    timezone => "Europe/Warsaw"
    target => "Time"
  }
}
output {
  elasticsearch {
    hosts => ["${ES_HOST1}","${ES_HOST2}","${ES_HOST3}"]
    index => "csv_test7"
    #document_id => "%{[@metadata][fingerprint]}"
    user => "${logstash_user}"
    password => "${logstash_password}"
    ssl => true
    ssl_certificate_verification => false
    cacert => "${CACERT}"
  }
}

The data looks like this:

Hey, you don't need to convert "Time" to date_time in the csv filter; the date filter below already parses that field. Try this in your filter block instead:

filter {
  csv {
    separator => ","
    skip_header => "true"
    columns => ["No.","Time","Source","Destination","Protocol","Length","Info"]
    convert => {
      "No." => "integer"
    }
  }
  date {
    match => [ "Time", "ISO8601" ]
    timezone => "Europe/Warsaw"
    target => "Time"
  }
}
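If it still fails after that, a likely cause is that your "Time" values are not actually in ISO8601 format, in which case the date filter tags each event with _dateparsefailure (and a malformed row gets _csvparsefailure from the csv filter). A quick way to check is to temporarily swap the elasticsearch output for a stdout output and inspect the events (a debugging sketch, not something to keep in production):

output {
  # Print each parsed event so you can inspect the Time field and look for
  # _csvparsefailure / _dateparsefailure in the tags array
  stdout { codec => rubydebug }
}

If you do see _dateparsefailure, replace "ISO8601" in the date filter's match with a pattern that fits your actual timestamps, for example match => [ "Time", "yyyy-MM-dd HH:mm:ss.SSS" ] (a hypothetical pattern; adjust it to your data).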

Also, make sure to press Ctrl+E to format the code you paste into your question.