Is there anything special I need to put into the conf or the Logstash startup to get more debug output from the date filter?
I've tried running Logstash with log.level=trace and that still isn't helping. I also changed the output to codec => "rubydebug", and that isn't helping either.
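In case the exact settings matter: I set log.level: trace in logstash.yml (I believe the --log.level trace startup flag does the same thing), and the rubydebug change was just adding a debug output alongside elasticsearch, something like the sketch below (my real conf, which uses a file output instead, is pasted in full further down).

output {
  # temporary debug output so I can eyeball the full events
  stdout { codec => "rubydebug" }
}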
Sample line from /var/log/logstash-plain.log:
[2017-04-27T16:30:02,512][DEBUG][logstash.pipeline ] output received {"event"=>{"collection_date"=>2017-04-27T20:00:00.000Z, "system_date"=>2017-04-27T20:28:00.000Z, "ne_name"=>"YXCO41_FELDZ77-01", "ne_model"=>"Z77", "finished"=>1, "type"=>"CollectorStatus", "collector_interval"=>900, "active_hdl"=>"19b", "tags"=>["NetOptimizeCollectorStatus", "cyan", "_dateparsefailure"], "collection_status"=>"Finished", "@timestamp"=>2017-04-27T20:30:02.502Z, "abort"=>0, "collector_name"=>"YXCO41_FELDZ77-01", "@version"=>"1", "rpu_hostname"=>"sapl19", "ne_vendor"=>"Cyan"}}
Sample event from the rubydebug output:
{
    "collection_date" => 2017-04-27T20:45:00.000Z,
    "system_date" => 2017-04-27T21:12:00.000Z,
    "ne_name" => "TRHLPAXTO02",
    "ne_model" => "cyan-Z33",
    "finished" => 1,
    "type" => "CollectorStatus",
    "collector_interval" => 900,
    "active_hdl" => "20a",
    "tags" => [
        [0] "NetOptimizeCollectorStatus",
        [1] "cyan",
        [2] "_dateparsefailure"
    ],
    "collection_status" => "Finished",
    "@timestamp" => 2017-04-27T21:15:01.796Z,
    "abort" => 0,
    "collector_name" => "TRHLPAXTO02",
    "@version" => "1",
    "rpu_hostname" => "sapl20",
    "ne_vendor" => "Cyan"
}
I also searched previous posts and found the online tester site https://joda-time-parse-debugger.herokuapp.com/
My date seems to parse just fine with that pattern.
Since I'm using the JDBC input for Logstash, could it be that the "system_date" field isn't reaching the date filter in a form it can parse?
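One thing I've been thinking of trying, in case that's the issue (purely a sketch, and it assumes the JDBC input hands the column over as something other than a plain string), is forcing the field to a string before the date filter runs:

filter {
  mutate {
    # hypothetical workaround: make sure system_date is a string before date {} sees it
    convert => { "system_date" => "string" }
  }
}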
I had multiple conf files before, but I condensed everything into a single file while trying to solve this. Below is now my ONLY conf file for Logstash. (Note: I altered some field values in the jdbc section since this is a public forum.)
# The # character at the beginning of a line indicates a comment. Use
# comments to describe your configuration.
input {
  jdbc {
    jdbc_connection_string => "jdbc:oracle:thin:@//fwsd01:1531/PROD"
    jdbc_driver_library => "/usr/share/logstash/jdbc_drivers/oracle-jdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_password => "xxxxxxxx"
    jdbc_user => "xxxxxxxx"
    jdbc_validate_connection => "true"
    #parameters => { "table" => "timezone" }
    schedule => "*/15 * * * *"
    #sql_log_level => "debug"
    statement_filepath => "/usr/share/logstash/DB_queries/c_status.sql"
    record_last_run => "false"
    tags => ["valid", "NetOptimizeCollectorStatus", "cyan"]
    type => "CollectorStatus"
  }
}
# The filter part of this file is commented out to indicate that it is optional.
filter {
  if "CollectorStatus" == [type] {
    date {
      match => ["system_date", "YYYY-MM-dd'T'HH:mm:ss.SSS'Z'", "ISO8601"]
      timezone => "Etc/UTC"
      target => "@timestamp"
    }
  }
  if "valid" not in [tags] {
    drop { }
  }
  mutate {
    remove_tag => [ "valid" ]
  }
}
output {
  if "CollectorStatus" == [type] {
    elasticsearch {
      hosts => ["10.112.91.113:9200"]
      id => "collector_status"
      index => "collector_status_%{+YYYY.MM.dd}"
    }
  }
  file {
    path => "/tmp/logstash/stdout_%{+YYYY.MM.dd}"
    codec => "rubydebug"
    flush_interval => 0
  }
}
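One more idea I'm considering, in case it helps anyone diagnose this (just a sketch, assuming the event.get/event.set Event API is available in my version): a temporary ruby filter that records the Ruby class of system_date, so I can see whether the JDBC input delivers it as a string or as some timestamp object before the date filter ever runs.

filter {
  ruby {
    # hypothetical debug step: expose the Ruby class of system_date as a field
    code => "event.set('system_date_class', event.get('system_date').class.to_s)"
  }
}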