Logstash Replace timestamp from csv

I am trying to replace the timestamp with a field from the log file, but it is not getting updated. I get a date parse error (_dateparsefailure) and the timestamp is still not replaced. The config file is attached for reference.

The date format in the log file is: 6/19/2018 9:42:00 PM

input {
  file {
    path => "D:/IntraDay1.csv"
    start_position => "beginning"  # read from the beginning of the file
    sincedb_path => "/dev/null"    # on Windows, "NUL" is the usual equivalent
  }
}

filter {
  csv {
    columns => ["Dispo"]
  }

  mutate {
    # convert block was truncated in the original post
    # convert => { ... }
  }

  date {
    match => ["DateTime", "yyyy-MM-dd HH:mm:ss,SSS"]
    timezone => "UTC"
    add_field => { "Status" => "Matched" } # add_tag => [ "timestamp_matched" ]
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "intraday"
    workers => 1
  }
  stdout { codec => rubydebug }
}

Your date is in the format 6/19/2018 9:42:00 PM, and you are trying to parse it using

match => ["DateTime", "yyyy-MM-dd HH:mm:ss,SSS"]

Try

 date { match => [ "DateTime", "M/dd/yyyy h:mm:ss aa" ] }
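Each token in that pattern maps onto the sample value 6/19/2018 9:42:00 PM: M is the month without a leading zero, dd the day, yyyy the four-digit year, h the hour on a 12-hour clock, mm and ss the minutes and seconds, and aa the AM/PM marker. A sketch of the fuller filter (field name DateTime as used in this thread; target defaults to @timestamp, which is what replaces the event time):

    date {
      match    => [ "DateTime", "M/dd/yyyy h:mm:ss aa" ]
      timezone => "UTC"  # interpret the wall-clock timestamp as UTC
      # no target needed: the parsed value overwrites @timestamp by default
    }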

Thanks Badger, still the same date parse error and the timestamp is not getting updated.

Show us the message from the JSON tab in Kibana Discover.


I modified the naming convention just for security.

date { match => [ "DateTime", "M/dd/yy h:mm aa" ] }
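Note the difference from the earlier suggestion: this pattern expects a two-digit year and no seconds, i.e. values like 6/19/18 9:42 PM rather than 6/19/2018 9:42:00 PM. If the field can hold either form, the date filter accepts a list of patterns and tries each in turn (a sketch, with the DateTime field name from this thread):

    date {
      match => [ "DateTime", "M/dd/yy h:mm aa", "M/dd/yyyy h:mm:ss aa" ]
      # if none of the patterns match, the event is tagged
      # _dateparsefailure, which is what the original config was hitting
    }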

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.