I have an Oracle log file and I need to parse the date in a field named backup_start.
The date format in this field looks like this: 04-12-2017 20:30
and my config is:

date {
  match => ["backup_start", "dd-MM-yyyy HH:mm"]
  timezone => "Europe/Paris"
}
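As a quick sanity check (a sketch outside Logstash, using Python's strptime rather than the Joda-Time parser the date filter actually uses), the equivalent pattern does parse the sample value, so the pattern string itself looks correct:

```python
from datetime import datetime

# Joda pattern dd-MM-yyyy HH:mm maps to strptime %d-%m-%Y %H:%M
sample = "04-12-2017 20:30"
parsed = datetime.strptime(sample, "%d-%m-%Y %H:%M")
print(parsed)  # 2017-12-04 20:30:00
```

If the pattern matches but Kibana still shows today's date, the field reaching the filter may be empty or named differently than expected.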
All the dates in this field belong to this month, but when I check Kibana, all the dates in this field show as today. I checked my Logstash log file and there is no error or warning. What is the problem? Should my date field also contain seconds?
{:timestamp=>"2017-12-14T14:37:38.661000+0100", :message=>"CAUTION: Recommended inflight events max exceeded! Logstash will run with up to 16000 events in memory in your current configuration. If your message sizes are large this may cause instability with the default heap size. Please consider setting a non-standard heap size, changing the batch size (currently 2000), or changing the number of pipeline workers (currently 8)", :level=>:warn}
{:timestamp=>"2017-12-14T14:37:38.683000+0100", :message=>"Pipeline main started"}
{:timestamp=>"2017-12-14T14:39:13.897000+0100", :message=>"Error parsing csv", :field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>, :level=>:warn}