Logstash: unable to parse date field from CSV

I'm not able to parse the date field ("Time") from the CSV file below; it fails every time.
Any help would be appreciated.

CSV
Time,TotalMsg,CancelMsg,NEWMsg,ModifyMsg,CancelThrottle,ModifyThrottle,NEWThrottle,TOTALThrottle,NewThrottlePercent,ModifyThrottlePercent,CancelThrottlePercent
20190215-09:15,53867,3178,2724,47965,1258,5156,99,6513,3,10,39
20190215-09:16,31272,1051,1156,29065,98,2267,0,2365,0,7,9
20190215-09:17,21773,831,1044,19898,10,369,0,379,0,1,1

Logstash config
input {
  file {
    path => "/home/ramesh/Final.Throttle.txt*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    columns => [
      "Time",
      "TotalMsg",
      "CancelMsg",
      "NEWMsg",
      "ModifyMsg",
      "CancelThrottle",
      "ModifyThrottle",
      "NEWThrottle",
      "TOTALThrottle",
      "NewThrottlePercent",
      "ModifyThrottlePercent",
      "CancelThrottlePercent"
    ]
    separator => ","
    remove_field => ["message"]
  }
  date {
    match => ["Time", "yyyyMMdd-HH:mm"]
    target => "Time"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "dataset"
  }
  stdout {}
}

Error
[2019-02-26T17:04:48,873][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"dataset", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x44b0d1f4>], :response=>{"index"=>{"_index"=>"dataset", "_type"=>"doc", "_id"=>"c9aVKWkBr_HsS0z9-aV-", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [Time] of type [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "20190215-15:25" is malformed at "0215-15:25""}}}}}

If the date filter were working, the Time field would no longer contain "20190215-15:25" by the time it reached Elasticsearch. And that date filter would parse the field if it had that format. So there has to be something you are not telling us...
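One way to rule out the format string itself is to test the filters in isolation. Here is a minimal sketch (assuming the stock generator input and rubydebug codec, both shipped with Logstash) that feeds one sample line from the CSV through the same csv and date filters:

```
# Sketch: verify the date filter alone, with no file input or Elasticsearch involved.
# Save as test.conf and run `bin/logstash -f test.conf`.
input {
  generator {
    lines => ["20190215-09:15,53867,3178,2724,47965,1258,5156,99,6513,3,10,39"]
    count => 1
  }
}
filter {
  csv {
    columns => ["Time","TotalMsg","CancelMsg","NEWMsg","ModifyMsg","CancelThrottle","ModifyThrottle","NEWThrottle","TOTALThrottle","NewThrottlePercent","ModifyThrottlePercent","CancelThrottlePercent"]
  }
  date {
    match => ["Time", "yyyyMMdd-HH:mm"]
    target => "Time"
  }
}
output {
  stdout { codec => rubydebug }
}
```

If stdout shows Time as a parsed timestamp rather than the raw string, the filter and the format are fine, and the problem is in whichever config the running pipeline actually loaded (check for stray files if you point Logstash at a config directory).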

Nothing to hide here. Tell me what to explore further to fix this issue and I can dig into it.
See the log below for more details.

[2019-02-26T17:02:19,658][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-02-26T17:02:19,825][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-02-26T17:02:19,871][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-02-26T17:02:19,875][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2019-02-26T17:02:19,900][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-02-26T17:02:19,914][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-02-26T17:02:19,930][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-02-26T17:02:20,153][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x363ff72f run>"}
[2019-02-26T17:02:20,209][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-02-26T17:02:20,212][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-02-26T17:02:20,481][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-02-26T17:02:21,199][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"dataset", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0xaf89fc1>], :response=>{"index"=>{"_index"=>"dataset", "_type"=>"doc", "_id"=>"P9aTKWkBr_HsS0z9uaL8", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [Time] of type [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "20190215-09:19" is malformed at "0215-09:19""}}}}}
[2019-02-26T17:02:21,206][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"dataset", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x4e13d656>], :response=>{"index"=>{"_index"=>"dataset", "_type"=>"doc", "_id"=>"ltaTKWkBr_HsS0z9uqIO", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [Time] of type [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "20190215-09:20" is malformed at "0215-09:20""}}}}}
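Note that the rejection in these logs comes from Elasticsearch's mapper, not from Logstash: the `dataset` index already has `Time` mapped as type `date` (presumably from an earlier run), so any document that still carries the raw string will keep bouncing even after the pipeline is fixed. A sketch of the checks, assuming a local cluster and that `dataset` is a disposable test index:

```
# Inspect how Elasticsearch has mapped the Time field in this index
curl -s 'localhost:9200/dataset/_mapping?pretty'

# If the index is only test data, delete it so it can be recreated
# once the pipeline emits properly parsed dates
curl -s -X DELETE 'localhost:9200/dataset'
```

Only delete the index if the data is expendable; otherwise reindex into a new index whose mapping matches the corrected output.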

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.