I'm not able to parse the date field ("Time") from the CSV file below; every event fails with the error shown at the end. Any help would be appreciated.
CSV
Time,TotalMsg,CancelMsg,NEWMsg,ModifyMsg,CancelThrottle,ModifyThrottle,NEWThrottle,TOTALThrottle,NewThrottlePercent,ModifyThrottlePercent,CancelThrottlePercent
20190215-09:15,53867,3178,2724,47965,1258,5156,99,6513,3,10,39
20190215-09:16,31272,1051,1156,29065,98,2267,0,2365,0,7,9
20190215-09:17,21773,831,1044,19898,10,369,0,379,0,1,1
Logstash config
input {
  file {
    path => "/home/ramesh/Final.Throttle.txt*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    columns => [
      "Time",
      "TotalMsg",
      "CancelMsg",
      "NEWMsg",
      "ModifyMsg",
      "CancelThrottle",
      "ModifyThrottle",
      "NEWThrottle",
      "TOTALThrottle",
      "NewThrottlePercent",
      "ModifyThrottlePercent",
      "CancelThrottlePercent"
    ]
    separator => ","
    remove_field => ["message"]
  }
  date {
    match => ["Time", "yyyyMMdd-HH:mm"]
    target => "Time"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "dataset"
  }
  stdout {}
}
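For what it's worth, the header line of the file is read as an event too, so the literal string "Time" would also go through the date filter. A sketch of a conditional I could add to the filter block to drop it (not verified to be the cause of the error below):

filter {
  # Drop the CSV header row so the literal string "Time"
  # is never sent to Elasticsearch as a date value
  if [Time] == "Time" {
    drop { }
  }
}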
Error
[2019-02-26T17:04:48,873][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"dataset", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x44b0d1f4>], :response=>{"index"=>{"_index"=>"dataset", "_type"=>"doc", "_id"=>"c9aVKWkBr_HsS0z9-aV-", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [Time] of type [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"20190215-15:25\" is malformed at \"0215-15:25\""}}}}}
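Since the rejection comes from Elasticsearch rather than from the date filter itself (the raw string "20190215-15:25" reached the index unconverted), I wonder if the "dataset" index already has an incompatible date mapping for Time from an earlier run. A sketch of a check to rule that out, assuming the index is a throwaway test index that is safe to delete:

# Inspect the current mapping of the Time field
GET localhost:9200/dataset/_mapping

# Recreate the index from scratch before re-running Logstash
DELETE localhost:9200/dataset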