Having a problem loading a text file that contains DATE fields using Logstash

Hi,

I am totally new to the ELK world.
I have tried to load a text file that contains several date fields using Logstash.
I have tried to configure the fields START_DATE and END_DATE to use the following format: DD/MM/YYYY HH:MM:SS

The config file looks like:

input {
  file {
    path => "/tmp/radius_p2016*.txt"
    type => "radius_log_test"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    columns => ["USERNAME","ACCT_SESSION","IP","START_DATE","END_DATE",.........,"TERM_CODE"]
    separator => ","
  }

  date {
    match => ["END_DATE", "DD/MM/YYYY HH:MM:SS"]
    target => "END_DATE"
  }

  date {
    match => ["START_DATE", "DD/MM/YYYY HH:MM:SS"]
    target => "START_DATE"
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "radius-test"
  }

  stdout {
    codec => rubydebug
  }
}

And the error messages look like:

Failed parsing date from field {:field=>"START_DATE", :value=>"18/10/2016 01:20:49", :exception=>"Cannot parse "18/10/2016 01:20:49": Value 20 for monthOfYear must be in the range [1,12]", :config_parsers=>"DD/MM/YYYY HH:MM:SS", :config_locale=>"default=en_US", :level=>:warn}
Failed parsing date from field {:field=>"END_DATE", :value=>"18/10/2016 01:23:21", :exception=>"Cannot parse "18/10/2016 01:23:21": Value 23 for monthOfYear must be in the range [1,12]", :config_parsers=>"DD/MM/YYYY HH:MM:SS", :config_locale=>"default=en_US", :level=>:warn}
.....
Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"radius-test", :_type=>"radius_log_test", :_routing=>nil},
#<LogStash::Event:0x54aba0ac @metadata={"path"=>"/tmp/radius_p20161018_2.txt"},
@accessors=#<LogStash::Util::Accessors:0x32e463ef
@store={"message"=>"0355349081,JJJ10JJJJJ58024A03-58054574,1337101124,18/10/2016 01:23:21,18/10/2016 04:41:09,7750-XXXX-XX8097 lag-171:349.90,,3558425636,XXXX,150019,Start,3568549330,2481677921,int9,RB-9,Iint,0,0,100000,,,1476740469,RADIUS,,,,PPP,Framed-User,31949774396,0\r", "@version"=>"1", "@timestamp"=>"2016-10-22T18:57:39.150Z"
.....

Please advise

Regards

  match => ["END_DATE", "DD/MM/YYYY HH:MM:SS"]

"MM" can't mean both month and minute. Have a look at DateTimeFormat (Joda time 2.2 API) for a table of available format tokens.

Hi magnusbaeck,

Thanks for your previous feedback.

It seems that I am missing something here.

You said that "MM can't mean both month and minute".

As you can see below, I have changed the format from DD/MM/YYYY HH:MM:SS to
dd-MM-yyyy HH:mm:ss and there is no problem with the format.
How do you explain that?

Regards

[elastic@ptktl-elk ~]$ curl -XPUT "http://localhost:9200/yoavmyindex3" -d'

{
  "mappings": {
    "my_type": {
      "properties": {
        "start_date": {
          "type": "date",
          "format": "dd-MM-yyyy HH:mm:ss"
        }
      }
    }
  }
}

'
{"acknowledged":true}[elastic@ptktl-elk ~]$
[elastic@ptktl-elk ~]$
[elastic@ptktl-elk ~]$ curl -XGET 'http://localhost:9200/yoavmyindex3/_mapping?pretty'
{
  "yoavmyindex3" : {
    "mappings" : {
      "my_type" : {
        "properties" : {
          "start_date" : {
            "type" : "date",
            "format" : "dd-MM-yyyy HH:mm:ss"
          }
        }
      }
    }
  }
}
[elastic@ptktl-elk ~]$

I'm talking about your date filter, not your mappings. Your date pattern "DD/MM/YYYY HH:MM:SS" is wrong because you have "MM", which means month, in two places. "DD" and "SS" are also wrong.
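
Something along these lines should work for both fields (again just a sketch, assuming the same dd/MM/yyyy HH:mm:ss values as in your sample line):

filter {
  date {
    match => ["START_DATE", "dd/MM/yyyy HH:mm:ss"]
    target => "START_DATE"
  }
  date {
    match => ["END_DATE", "dd/MM/yyyy HH:mm:ss"]
    target => "END_DATE"
  }
}

Once the date filter parses the strings, the fields are emitted as ISO8601 timestamps, which the default Elasticsearch date mapping accepts, so you shouldn't need a custom format in the mapping at all.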