Date Parse failures

I have tried opening an official support case without getting any help on the following topic. Hoping someone here can help....

I have several log sources with a date field like this: Mar 22 2020 21:52:00

In my index template, for the field where this date is parsed, I have tried the following date format with and without quotes: MMM dd yyyy HH:mm:ss. The Advanced Options tab in Kibana does not permit spaces in the date format and therefore requires quotes around the format. The field mapping's Set Format field does not require quotes.
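For reference, the kind of mapping I am describing looks roughly like this (the field name is from my pipeline; the custom format is appended after the defaults):

```json
{
  "mappings": {
    "properties": {
      "dateField": {
        "type": "date",
        "format": "strict_date_optional_time||epoch_millis||MMM dd yyyy HH:mm:ss"
      }
    }
  }
}
```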

I have also tried specifying the format within a pipeline filter date section. Example:

date {
    match => [ "dateField", "MMM d yyyy HH:mm:ss", "MMM dd yyyy HH:mm:ss" ]
    timezone => "America/New_York"
}

Any thoughts?

When I run

input { generator { count => 1 lines => [ '' ] } }
filter {
    mutate { add_field => { "dateField" => "Mar 22 2020 21:52:00" } }
    date {
        match => [ "dateField", "MMM d yyyy HH:mm:ss", "MMM dd yyyy HH:mm:ss" ]
        timezone => "America/New_York"
    }
}
output { stdout { codec => rubydebug { metadata => false } } }

I get

"@timestamp" => 2020-03-23T01:52:00.000Z,

which looks right to me. Do you want to change [dateField] to be a LogStash::Timestamp type?

date {
    match => [ "dateField", "MMM d yyyy HH:mm:ss", "MMM dd yyyy HH:mm:ss" ]
    timezone => "America/New_York"
    target => "dateField"
}

I guess you are missing target =>

@elasticforme I apologize, I just realized that I forgot to include the actual question, which is: why is Logstash unable to parse the date string?

I did not specify a target because I want that value to become the value of @timestamp, which is the default. So target is not the concern.

[2020-08-28T16:15:40,973][WARN ][logstash.outputs.elasticsearch][PIPELINENAME][377cf5ec8a0e2892ceab3f8195eae7e7343bce8263b70dc7a449de4c446dce17] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"PIPELINENAME", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x76cf0ebd>], :response=>{"index"=>{"_index"=>"PIPELINENAME-000001", "_type"=>"_doc", "_id"=>"c2S2NnQBAHcBNctHJcKn", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [dateField] of type [date] in document with id 'c2S2NnQBAHcBNctHJcKn'. Preview of field's value: 'Mar 22 2020 21:52:00'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [Mar 22 2020 21:52:00] with format [strict_date_optional_time||epoch_millis||\"MMM dd yyyy HH:mm:ss\"]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}

@Badger I am not sure what you mean by "LogStash::Timestamp type"?

Sorry, I should have worded that differently. I was asking if you wanted dateField to remain a string, or whether you wanted to set the target option on the date filter to make the filter convert it to the same type that @timestamp has.

Can you try using

output { stdout { codec => rubydebug } }

and show us what [dateField] looks like. The date pattern has to match the entire field, so an extra space or different punctuation will prevent it from matching. Are you getting a _dateparsefailure tag added to the event?
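One way to see exactly which events are failing is to route anything carrying that tag to its own output; a sketch (the file path is just an example):

```
output {
  if "_dateparsefailure" in [tags] {
    file {
      path => "/tmp/dateparsefailures.log"
      codec => rubydebug
    }
  } else {
    stdout { codec => rubydebug }
  }
}
```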

Yes, the _dateparsefailure tag is being attached. Also, in some cases, yes, dateField needs to become @timestamp. In other cases not. I answer this way as some logs have a date/time field to be used for @timestamp while also having supplemental start/end times.

I will find another example from one of my problematic sources and post it. I will also enable the rubydebug output and grab an example. Be back shortly.

@Badger, I am not seeing any additional output with the rubydebug output clause. Would this make sense considering that I am not running the Logstash process from the command line but rather have it running as a service?

I am iterating through one of our log sources and just pulled the following from the logstash-plain log:

[2020-08-31T16:29:03,403][WARN ][logstash.outputs.elasticsearch][PIPELINENAME][cba9b3f44e356cb5073da38ba0e8c9f96b9327d0b52738970aa47155f06c58a6] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"PIPELINENAME", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x66f5d913>], :response=>{"index"=>{"_index"=>"PIPELINENAME-000001", "_type"=>"_doc", "_id"=>"r4E1RnQBAHcBNctHeOQi", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [dateField] of type [date] in document with id 'r4E1RnQBAHcBNctHeOQi'. Preview of field's value: '2020-08-31 08:02:04'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2020-08-31 08:02:04] with format [strict_date_optional_time||epoch_millis||yyyy-MM-dd HH:mm:ss]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}

If you are running it as a service you could write events to a file using a rubydebug codec.
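Something along these lines, for example (the path is just a placeholder):

```
output {
  file {
    path => "/tmp/debug-events.log"
    codec => rubydebug
  }
}
```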

Preview of field's value: '2020-08-31 08:02:04'", 
"caused_by"=>{"type"=>"illegal_argument_exception", 
"reason"=>"failed to parse date field [2020-08-31 08:02:04] with format 
[strict_date_optional_time||epoch_millis||yyyy-MM-dd HH:mm:ss]"

I cannot think why Elasticsearch would fail to parse that value when yyyy-MM-dd HH:mm:ss is in the list of parsers.
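To narrow it down, that mapping could be tested in isolation against a throwaway index (the index name here is just an example) in Kibana Dev Tools:

```
PUT test-dateparse
{
  "mappings": {
    "properties": {
      "dateField": {
        "type": "date",
        "format": "strict_date_optional_time||epoch_millis||yyyy-MM-dd HH:mm:ss"
      }
    }
  }
}

POST test-dateparse/_doc
{
  "dateField": "2020-08-31 08:02:04"
}
```

If the second request also returns a mapper_parsing_exception, that would point at the mapping itself rather than anything Logstash is doing.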