Hi Everyone,
I'm using Logstash version 2.4.1.
I'm having a lot of trouble getting the first row of my CSV dropped. My filter keeps picking it up and trying to convert it to a date.
Here is the filter portion of my config:
filter {
  csv {
    separator => ","
    columns => [ "device_product", "timestamp", "count" ]
  }
  if ("timestamp" == "Hour(TimeStamp)") {
    drop { }
  }
  else {
    date {
      match => [ "timestamp", "YYYY-MM-dd HH" ]
    }
  }
  mutate {
    remove_field => [ "timestamp", "message" ]
    convert => { "count" => "integer" }
  }
}
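One thing I'm wondering about: the Logstash docs describe conditionals as comparing field values using square-bracket syntax like [timestamp], whereas a double-quoted "timestamp" is just a string literal. Here's a sketch of how I think the conditional would look with field-reference syntax — I haven't verified this on 2.4.1, so treat it as a guess rather than a known fix:

```
if [timestamp] == "Hour(TimeStamp)" {
  # drop the CSV header row, whose timestamp column
  # contains the literal text "Hour(TimeStamp)"
  drop { }
} else {
  date {
    match => [ "timestamp", "YYYY-MM-dd HH" ]
  }
}
```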
Whenever I start up Logstash, I keep getting the following error:
Failed parsing date from field {:field=>"timestamp", :value=>"Hour(TimeStamp)", :exception=>"Invalid format: "Hour(TimeStamp)"", :config_parsers=>"YYYY-MM-dd HH", :config_locale=>"default=en_US", :level=>:warn}
It's not skipping the first line and keeps trying to convert the timestamp to a date. I've tried countless variations of this config to get it working, but I'm hoping someone out there knows something I don't.
Here is some sample data:
Device Product,Hour(TimeStamp),Sum(Aggregated Event Count)
AIX Audit,2017-03-06 20,19
ASA,2017-03-06 20,15658116
As you can see, the timestamp field in the first row equals "Hour(TimeStamp)", so that row should be dropped per the conditional I specified above. Also notice the dates are for March 6th. The image below shows the data uploaded in Kibana. Notice how the first row is mis-parsed and ends up on the 8th, since Logstash still tries to parse the header line as a date.
I really appreciate any help I can get on this.