Failed parsing date from field

I have the conf file reading data, but I am still getting exceptions about my CSV timestamp. Is this the proper ISO 8601 format? 2016-02-21T04:15:04.290

Failed parsing date from field {:field=>"mydatetime", :value=>"2016-02-21T04:15:04.290", :exception=>"Invalid format: "2016-02-21T04:15:04.290" is malformed at "T04:15:04.290"", :config_parsers=>"yyyy-MM-dd HH:mm:ss.SSS", :config_locale=>"en", :level=>:warn}

Other notes:

So you're attempting to parse the string "2016-02-21T04:15:04.290" with the pattern "yyyy-MM-dd HH:mm:ss.SSS"? That won't work because your pattern doesn't account for the "T" between the date and the time. Use "yyyy-MM-dd'T'HH:mm:ss.SSS" instead.
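To illustrate the difference, here is a small sketch of my own in Ruby (Logstash itself uses Joda-Time patterns, and Ruby's strptime tokens differ, but the "T" problem is identical):

```ruby
require 'date'

value = "2016-02-21T04:15:04.290"

begin
  # This pattern expects a space between date and time, so the literal
  # "T" in the input makes parsing fail -- the same kind of failure the
  # Logstash warning above reports.
  DateTime.strptime(value, "%Y-%m-%d %H:%M:%S.%L")
rescue ArgumentError
  puts "pattern without 'T' failed to parse"
end

# Matching the "T" explicitly parses cleanly:
parsed = DateTime.strptime(value, "%Y-%m-%dT%H:%M:%S.%L")
puts parsed.iso8601(3)  # => "2016-02-21T04:15:04.290+00:00"
```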

Failed parsing date from field {:field=>"mydatetime", :value=>"2016-02-21T04:2016-02-21", :exception=>"Invalid format: "2016-02-21T04:2016-02-21" is malformed at "16-02-21"", :config_parsers=>"yyyy-MM-dd'T'HH:mm:ss.SSS", :config_locale=>"en", :level=>:warn}

Here is what I currently have:
input {
  file {
    path => "/home/bkelley6/flights/*.csv"
    type => "flights"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["Date", "Time", "SWR", "RSSI(dB)", "RxBt(V)", "Cels(gRe)", "Tmp2(@C)", "RPM(rpm)", "Tmp1(@C)", "Rud", "Ele", "Thr", "Ail", "S1", "S2", "S3", "LS", "RS", "SA", "SB", "SC", "SD", "SE", "SF", "SG", "SH"]
    separator => ","
  }

  mutate {
    replace => [ "mydatetime", "%{Date}T%{Time}" ]
  }

  date {
    locale => "en"
    match => [ "mydatetime", "yyyy-MM-dd'T'HH:mm:ss.SSS" ]
    timezone => "America/New_York"
    target => ["@timestamp"]
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
}

A sample of the CSV file:

Date,Time,SWR,RSSI(dB),RxBt(V),Cels(gRe),Tmp2(@C),RPM(rpm),Tmp1(@C),Rud,Ele,Thr,Ail,S1,S2,S3,LS,RS,SA,SB,SC,SD,SE,SF,SG,SH,
2016-02-21,04:11:14.640,30,75,5.2,23.4,0,0,0,0,0,0,0,0,0,0,0,0,-1,-1,-1,-1,-1,-1,-1,-1,
2016-02-21,04:11:14.840,30,75,5.2,23.4,0,0,0,0,0,0,0,0,0,0,0,0,-1,-1,-1,-1,-1,-1,-1,-1,

Thanks for all your help.

Works fine for me.

$ cat test.config
filter {
  csv {
    columns => ["Date", "Time", "SWR", "RSSI(dB)", "RxBt(V)", "Cels(gRe)", "Tmp2(@C)", "RPM(rpm)", "Tmp1(@C)", "Rud", "Ele", "Thr", "Ail", "S1", "S2", "S3", "LS", "RS", "SA", "SB", "SC", "SD", "SE", "SF", "SG", "SH"]
    separator => ","
  }

  mutate {
    replace => [ "mydatetime", "%{Date}T%{Time}" ]
  }

  date {
    locale => "en"
    match => [ "mydatetime", "yyyy-MM-dd'T'HH:mm:ss.SSS" ]
    timezone => "America/New_York"
    target => ["@timestamp"]
  }
}
input { stdin {} }
output { stdout { codec => rubydebug } }
$ echo '2016-02-21,04:11:14.640,30,75,5.2,23.4,0,0,0,0,0,0,0,0,0,0,0,0,-1,-1,-1,-1,-1,-1,-1,-1,' | /opt/logstash/bin/logstash -f test.config
Settings: Default pipeline workers: 2
Logstash startup completed
{
       "message" => "2016-02-21,04:11:14.640,30,75,5.2,23.4,0,0,0,0,0,0,0,0,0,0,0,0,-1,-1,-1,-1,-1,-1,-1,-1,",
      "@version" => "1",
    "@timestamp" => "2016-02-21T09:11:14.640Z",
          "host" => "hallonet",
          "Date" => "2016-02-21",
          "Time" => "04:11:14.640",
           "SWR" => "30",
      "RSSI(dB)" => "75",
       "RxBt(V)" => "5.2",
     "Cels(gRe)" => "23.4",
      "Tmp2(@C)" => "0",
      "RPM(rpm)" => "0",
      "Tmp1(@C)" => "0",
           "Rud" => "0",
           "Ele" => "0",
           "Thr" => "0",
           "Ail" => "0",
            "S1" => "0",
            "S2" => "0",
            "S3" => "0",
            "LS" => "0",
            "RS" => "0",
            "SA" => "-1",
            "SB" => "-1",
            "SC" => "-1",
            "SD" => "-1",
            "SE" => "-1",
            "SF" => "-1",
            "SG" => "-1",
            "SH" => "-1",
      "column27" => nil,
    "mydatetime" => "2016-02-21T04:11:14.640"
}
Logstash shutdown completed

Would it cause an issue that I am also taking in the first line (the header row)?

Brad

Yes, but not that kind of error.
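The header row gets parsed like any other line, so you end up with a junk event where the Date field literally contains "Date" (and a date filter warning for it). A common workaround, sketched here under the assumption that your filter looks like the one above, is to drop that event before the date filter sees it:

```
filter {
  csv {
    # ... your existing csv settings ...
  }
  # The header row parses with the literal column name in the Date field;
  # drop that event so it never reaches mutate/date.
  if [Date] == "Date" {
    drop { }
  }
}
```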

When I add the output section to the conf:

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
}

I get these error messages:

Failed parsing date from field {:field=>"mydatetime", :value=>"2016-02-21T04:2016-02-21", :exception=>"Invalid format: "2016-02-21T04:2016-02-21" is malformed at "16-02-21"", :config_parsers=>"yyyy-MM-dd'T'HH:mm:ss.SSS", :config_locale=>"en", :level=>:warn}

The stdin debug test worked great for me too!

I'm not sure that's the issue.

Thanks for your help,
Brad

Do I need to remove the mydatetime variable?
Thanks,
Brad

I'm not sure what's going on here, but given that you parse the timestamp into @timestamp I don't see why you'd want to keep mydatetime.
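The date filter can remove the field itself once parsing succeeds; something like this (my sketch, based on your existing date block):

```
date {
  locale => "en"
  match => [ "mydatetime", "yyyy-MM-dd'T'HH:mm:ss.SSS" ]
  timezone => "America/New_York"
  target => ["@timestamp"]
  # remove_field only runs if the date parses successfully, so you keep
  # the evidence when parsing fails.
  remove_field => ["mydatetime"]
}
```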

Failed parsing date from field {:field=>"mydatetime", :value=>"2016-02-21T17:2016-02-21", :exception=>"Invalid format: "2016-02-21T17:2016-02-21" is malformed at "16-02-21"", :config_parsers=>"yyyy-MM-dd'T'HH:mm:ss.SSS", :config_locale=>"en", :level=>:warn}

This error only appears when I output to Elasticsearch:
output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
}

A little more information, and some big hints about the problem.

I wrote a slightly different config file (notice the input method):
filter {
  csv {
    columns => ["Date", "Time", "SWR", "RSSI(dB)", "RxBt(V)", "Cels(gRe)", "Tmp2(@C)", "RPM(rpm)", "Tmp1(@C)", "Rud", "Ele", "Thr", "Ail", "S1", "S2", "S3", "LS", "RS", "SA", "SB", "SC", "SD", "SE", "SF", "SG", "SH"]
    separator => ","
  }

  mutate {
    replace => [ "mydatetime", "%{Date}T%{Time}" ]
  }

  date {
    locale => "en"
    match => [ "mydatetime", "yyyy-MM-dd'T'HH:mm:ss.SSS" ]
    timezone => "America/New_York"
    target => ["@timestamp"]
    remove_field => ["mydatetime"]
  }
}

input { stdin {} }
output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
}

And I wrote this one-liner to process one line at a time:

cat /home/bkelley6/flights/T-Rex_500-2016-02-21-1-test.csv | while read line ; do echo $line | /opt/logstash/bin/logstash -f magnus-elastic.conf; done

This works!

So I am thinking the problem is with the input in the other conf file:

input {
  file {
    path => "/home/bkelley6/flights/*.csv"
    type => "flights"
    start_position => "beginning"
  }
}

Why would this be causing a problem?

Once again, thank you for all your help.
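One thing worth checking (my suggestion, not something established in this thread): the file input remembers how far it has read each file in a sincedb file, so re-running Logstash against the same CSVs can pick up stale read positions rather than reading from the top. For repeatable testing you can disable that persistence, for example:

```
input {
  file {
    path => "/home/bkelley6/flights/*.csv"
    type => "flights"
    start_position => "beginning"
    # For testing only: don't persist read positions, so every run
    # re-reads the files from the beginning.
    sincedb_path => "/dev/null"
  }
}
```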