CSV Timestamp issues

Here is my conf file, and it seems to be working OK, except I need the timestamp to be the actual time in the CSV file, not the time the file was added. What am I doing wrong? Total noob here, sorry.

input {
  file {
    path => "/home/bkelley6/flights/*.csv"
    type => "flights"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["Date", "Time", "SWR", "RSSI(dB)", "RxBt(V)", "Cels(gRe)", "Tmp2(@C)", "RPM(rpm)", "Tmp1(@C)", "Rud", "Ele", "Thr", "Ail", "S1", "S2", "S3", "LS", "RS", "SA", "SB", "SC", "SD", "SE", "SF", "SG", "SH"]
    separator => ","
  }

  mutate {
    replace => [ "date", "%{Date} %{Time}" ]
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
}

File sample

Date,Time,SWR,RSSI(dB),RxBt(V),Cels(gRe),Tmp2(@C),RPM(rpm),Tmp1(@C),Rud,Ele,Thr,Ail,S1,S2,S3,LS,RS,SA,SB,SC,SD,SE,SF,SG,SH,
2016-02-21,04:11:14.640,30,75,5.2,23.4,0,0,0,0,0,0,0,0,0,0,0,0,-1,-1,-1,-1,-1,-1,-1,-1,

Here is what I am getting.

You then need to do something like:

  date {
    match => [ "date", "dd/MM/YYYY hh:mm:ss a" ]
    timezone => "YOUR TZ HERE"
    target => "date"
  }

That'll properly match the value. If you want to replace @timestamp, just change the target value; you may then want to drop the original date field.
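
For example, a minimal sketch of that variant, keeping the same match pattern (the timezone is still a placeholder, and remove_field is the generic filter option for dropping the temporary field once it has been parsed):

  date {
    match => [ "date", "dd/MM/YYYY hh:mm:ss a" ]
    timezone => "YOUR TZ HERE"
    target => "@timestamp"            # overwrite @timestamp instead of "date"
    remove_field => [ "date" ]        # drop the temporary combined field
  }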

So I would put that after the mutate, or in place of it?

Thanks,
Brad

After the mutate, sorry!

Thank you,

input {
  file {
    path => "/home/bkelley6/flights/*.csv"
    type => "flights"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["Date", "Time", "SWR", "RSSI(dB)", "RxBt(V)", "Cels(gRe)", "Tmp2(@C)", "RPM(rpm)", "Tmp1(@C)", "Rud", "Ele", "Thr", "Ail", "S1", "S2", "S3", "LS", "RS", "SA", "SB", "SC", "SD", "SE", "SF", "SG", "SH"]
    separator => ","
  }

  mutate {
    replace => [ "date", "%{Date} %{Time}" ]
  }

  date {
    match => [ "date", "dd/MM/YYYY hh:mm:ss a" ]
    timezone => "America/New_York"
    target => "@tiemstamp"
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
}

Misspelled timestamp: target => "@tiemstamp" should be target => "@timestamp".

Failed parsing date from field {:field=>"date", :value=>"2016-02-21 17:17:57.440", :exception=>"Invalid format: \"2016-02-21 17:17:57.440\" is malformed at \"16-02-21 17:17:57.440\"", :config_parsers=>"dd/MM/YYYY hh:mm:ss a", :config_locale=>"default=en_US", :level=>:warn}

Oh sorry, that was a time format I was using for something else.

Change the match pattern to ISO8601.
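
In the date filter that would look something like this (a sketch; ISO8601 here is the filter's built-in pattern name, not a literal format string):

  date {
    match => [ "date", "ISO8601" ]
    timezone => "America/New_York"
    target => "@timestamp"
  }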

Here is what I currently have; it's still not taking the date format.

input {
  file {
    path => "/home/bkelley6/flights/*.csv"
    type => "flights"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["Date", "Time", "SWR", "RSSI(dB)", "RxBt(V)", "Cels(gRe)", "Tmp2(@C)", "RPM(rpm)", "Tmp1(@C)", "Rud", "Ele", "Thr", "Ail", "S1", "S2", "S3", "LS", "RS", "SA", "SB", "SC", "SD", "SE", "SF", "SG", "SH"]
    separator => ","
  }

  mutate {
    replace => [ "date", "%{Date} %{Time}" ]
  }

  date {
    match => [ "date", "YYYY-MM-dd;HH:mm:ss.SSS", "ISO8601" ]
    timezone => "America/New_York"
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
}

Still not parsing. I am at a loss.

What does the date field actually look like in the JSON?

Here are the first two lines of the CSV file I am trying to bring in; the first two fields are the date and the time, comma-delimited.

Date,Time,SWR,RSSI(dB),RxBt(V),Cels(gRe),Tmp2(@C),RPM(rpm),Tmp1(@C),Rud,Ele,Thr,Ail,S1,S2,S3,LS,RS,SA,SB,SC,SD,SE,SF,SG,SH,
2016-02-21,04:11:14.640,30,75,5.2,23.4,0,0,0,0,0,0,0,0,0,0,0,0,-1,-1,-1,-1,-1,-1,-1,-1
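
For reference, after the mutate above the combined field should come out as a plain string like this (a sketch interpolated from that sample line, not actual output):

  "date" => "2016-02-21 04:11:14.640"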

This is what I have now.

input {
  file {
    path => "/home/bkelley6/flights/*.csv"
    type => "flights"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["Date", "Time", "SWR", "RSSI(dB)", "RxBt(V)", "Cels(gRe)", "Tmp2(@C)", "RPM(rpm)", "Tmp1(@C)", "Rud", "Ele", "Thr", "Ail", "S1", "S2", "S3", "LS", "RS", "SA", "SB", "SC", "SD", "SE", "SF", "SG", "SH"]
    separator => ","
  }

  mutate {
    replace => [ "date", "%{Date} %{Time}" ]
  }

  date {
    locale => "en"
    match => [ "date", "YYYY-MM-dd hh:mm:ss.SSS" ]
    timezone => "America/New_York"
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
}
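
One detail worth flagging in the pattern above, as a guess rather than a confirmed diagnosis: in the Joda-style patterns the date filter uses, hh is the 12-hour clockhour (01-12) while HH is the 24-hour hour of day, so a value like the 17:17:57.440 from the earlier error can never match hh. A sketch using the 24-hour token:

  date {
    locale => "en"
    match => [ "date", "YYYY-MM-dd HH:mm:ss.SSS" ]   # HH, not hh
    timezone => "America/New_York"
    target => "@timestamp"
  }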

Anyone?

I think you need to start with the basics, the CSV filter, and work from there. I did that, and I can see a _csvparsefailure tag.

A little more information, and some big hints about the problem.

I wrote a slightly different config file (notice the input method):
filter {
  csv {
    columns => ["Date", "Time", "SWR", "RSSI(dB)", "RxBt(V)", "Cels(gRe)", "Tmp2(@C)", "RPM(rpm)", "Tmp1(@C)", "Rud", "Ele", "Thr", "Ail", "S1", "S2", "S3", "LS", "RS", "SA", "SB", "SC", "SD", "SE", "SF", "SG", "SH"]
    separator => ","
  }

  mutate {
    replace => [ "mydatetime", "%{Date}T%{Time}" ]
  }

  date {
    locale => "en"
    match => [ "mydatetime", "yyyy-MM-dd'T'HH:mm:ss.SSS" ]
    timezone => "America/New_York"
    target => ["@timestamp"]
    remove_field => ["mydatetime"]
  }
}

input { stdin {} }

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
}

And I wrote this one-liner to process one line at a time:

cat /home/bkelley6/flights/T-Rex_500-2016-02-21-1-test.csv | while read line ; do echo $line | /opt/logstash/bin/logstash -f magnus-elastic.conf; done

This works!

So I am thinking the problem is with the input in the other conf file:

input {
  file {
    path => "/home/bkelley6/flights/*.csv"
    type => "flights"
    start_position => "beginning"
  }
}

Why would this be causing a problem?
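
One possibility, offered as an assumption rather than a confirmed diagnosis: the file input tracks how far it has read into each file in a sincedb file, and start_position => "beginning" only applies to files Logstash has never seen before. If these CSVs were already picked up once (even by an earlier run that failed to parse them), re-running won't re-ingest them. A sketch that forces re-reading, for testing only:

  input {
    file {
      path => "/home/bkelley6/flights/*.csv"
      type => "flights"
      start_position => "beginning"
      sincedb_path => "/dev/null"   # testing only: don't remember read positions
    }
  }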

Once again thank you for all your help.

Maybe there is some weird formatting in the file that isn't immediately obvious?

Verify that your CSV file has a CRLF at the end of each line; you could open it in Notepad++ to validate.
So, nothing shows up in Elasticsearch now, whereas in your first post you were getting data into ES?