Date parsing failed

Hello

I have an Oracle log file and I need to parse the date in a field named backup_start.
The date format in this field looks like this: 04-12-2017 20:30

and my config is:

date {
  match => ["backup_start", "dd-MM-yyyy HH:mm"]
  timezone => "Europe/Paris"
}

All the dates in this field belong to this month, but when I check Kibana they all show today's date. I checked my Logstash log file and there is no error or warning. What is the problem? Should my date field contain seconds as well?

Thank you!

Please show an example event produced by Logstash. Copy/paste from Kibana's JSON tab.
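In the meantime, on the seconds question: the date filter can try several patterns in a single match, and tagging failures makes them visible in Kibana. A minimal sketch, assuming you may also see values with seconds (the target field and failure tag names are illustrative, not from your config):

date {
  # try the pattern without seconds first, then a variant with seconds
  match => ["backup_start", "dd-MM-yyyy HH:mm", "dd-MM-yyyy HH:mm:ss"]
  timezone => "Europe/Paris"
  # optional: write the result to its own field instead of overwriting @timestamp
  # (drop this line to keep the default behaviour)
  target => "backup_start_parsed"
  # events that fail to parse get this tag, so they are easy to find in Kibana
  tag_on_failure => ["_backup_start_parse_failure"]
}

If the parse fails, @timestamp keeps the value set when the event was received (i.e. now), which would explain every document showing today's date.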

Actually, it finally worked even though I didn't do anything, but I have an error in Kibana in

[tags]: beats_input_codec_plain_applied, _csvparsefailure

and in the Logstash log file:

 {:timestamp=>"2017-12-14T14:37:38.661000+0100", :message=>"CAUTION: Recommended inflight events max exceeded! Logstash will run with up to 16000 events in memory in your current configuration. If your message sizes are large this may cause instability with the default heap size. Please consider setting a non-standard heap size, changing the batch size (currently 2000), or changing the number of pipeline workers (currently 8)", :level=>:warn}
{:timestamp=>"2017-12-14T14:37:38.683000+0100", :message=>"Pipeline main started"}
{:timestamp=>"2017-12-14T14:39:13.897000+0100", :message=>"Error parsing csv", :field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>, :level=>:warn}

I am using Filebeat to send logs to Logstash.
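The _csvparsefailure tag and the "Error parsing csv" warning with an empty :source suggest that events with an empty message field are reaching the csv filter. A minimal sketch of a guard, assuming the filter reads from message (the condition is illustrative; keep whatever csv options you already use):

filter {
  # only run the csv filter when the message actually contains something
  if [message] =~ /\S/ {
    csv {
      source => "message"
      # columns => [ ... ]   # keep your existing column list here
    }
  }
}

Alternatively, a drop filter placed before the csv filter can discard empty events entirely.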

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.