Please help - date parse error (a very simple dd-mm-yyyy) CSV to Logstash

My CSV has the following:
gymid,gymdate,sets,exercise,worktime,rest,reps,weight,muclegroup
3013,20-03-2020,1,Barbell Deadlift,01.42,03.02,11,47,back

My config is:

input {
  file {
    path => "D:/gym1.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}

filter {
  csv {
    separator => ","
    columns => ["gymid", "gymdate", "sets", "exercise", "worktime", "rest", "reps", "weight", "musclegroup"]
  }

  date {
    match => ["gymdate", "dd-mm-yyyy"]
    target => "@timestamp"
  }

  mutate { convert => ["sets", "integer"] }
  mutate { convert => ["reps", "integer"] }
  mutate { convert => ["worktime", "float"] }
  mutate { convert => ["rest", "float"] }
  mutate { convert => ["reps", "float"] }
  mutate { convert => ["weight", "integer"] }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "gym"
    document_type => "workout"
  }
  stdout {}
}

I get a `_dateparsefailure`.

Month should be MM, not mm (in date patterns, lowercase mm means minutes), although I am surprised you get an error for that.
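For reference, the date filter from the original config with the corrected month pattern would look like this (adding an explicit timezone is optional, but it avoids the kind of offset shifts in @timestamp mentioned below):

date {
  match => ["gymdate", "dd-MM-yyyy"]
  timezone => "UTC"          # optional; keeps @timestamp from shifting by the local offset
  target => "@timestamp"
}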

Thanks a ton, this solved the problem. I saw different behaviours each time: sometimes it gave no error and simply ingested the date as a string, and sometimes the error came up. Also, until I set the timezone to UTC, the date kept showing other values.
Thank you for helping.
