Illegal_argument_exception: failed to parse date field, date_time_parse_exception

Hey,
I'm currently getting this error all the time:

[2019-09-09T18:10:24,175][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.09.09", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x503c4137>], :response=>{"index"=>{"_index"=>"logstash-2019.09.09", "_type"=>"_doc", "_id"=>"6wDKFm0BsHltw-Q5Xf3S", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [syslog_timestamp] of type [date] in document with id '6wDKFm0BsHltw-Q5Xf3S'. Preview of field's value: 'Sep 9 18:10:07'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [Sep 9 18:10:07] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}

I'm using "syslog_timestamp" in two places in my pipeline:

grok {
  match => {
    "message" => "<%{NONNEGINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{GREEDYDATA:syslog_message_outer}"
  }
}

date {
  match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss", "ISO8601" ]
}

I'm unable to find out which input event triggers this. How do I track it down? Logstash debug mode didn't help (too much output).
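
One thing I'm considering is Logstash's dead letter queue, which captures the events the elasticsearch output fails to index because of mapping errors like this one. A minimal sketch, assuming Logstash 7.x with dead_letter_queue.enable not yet set and a default path.data of /var/lib/logstash:

# First enable the queue in logstash.yml:
#   dead_letter_queue.enable: true
# Then read the rejected events back with a small extra pipeline:
input {
  dead_letter_queue {
    # The queue lives under <path.data>/dead_letter_queue; adjust to your install
    path => "/var/lib/logstash/dead_letter_queue"
  }
}
output {
  # The rejection reason is carried in [@metadata][dead_letter_queue]
  stdout { codec => rubydebug { metadata => true } }
}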

UPDATE

Alrighty sorry for panicking :smiley:

The Logstash error is above.

Rubydebug output of the offending event:

{
           "syslog_severity_code" => 6,
                "syslog_hostname" => "xx",
                           "tags" => [
...
    ],
    "%_logstash_processed_at" => 2019-09-09T16:10:24.074Z,
                       "@version" => "1",
           "syslog_message_outer" => "snmpd[1647]: message repeated 2 times: [ Connection from UDP: [x.x.x.x]:58475->[x.x.x.x]:161]",
                            "msg" => "message repeated 2 times: [ Connection from UDP: [x.x.x.x]:58475->[x.x.x.x]:161]",
                     "syslog_pid" => "1647",
                        "message" => "<30>Sep  9 18:10:07 xx snmpd[1647]: message repeated 2 times: [ Connection from UDP: [x.x.x.x]:58475->[x.x.x.x]:161]",
                "syslog_facility" => "daemon",
                "syslog_severity" => "informational",
                  "received_from" => "xx",
                     "@timestamp" => 2019-09-09T16:10:07.000Z,
                     "syslog_pri" => 30,
               "syslog_timestamp" => "Sep  9 18:10:07",
                           "port" => 45829,
           "syslog_facility_code" => 3,
                 "syslog_program" => "snmpd",
           "syslog_message_inner" => "message repeated 2 times: [ Connection from UDP: [x.x.x.x]:58475->[x.x.x.x]:161]",
                           "host" => "x.x.x.x"
}

Elasticsearch error (looking up document 6wDKFm0BsHltw-Q5Xf3S):

[2019-09-09T18:10:24,082][DEBUG][o.e.a.b.TransportShardBulkAction] [log02] [logstash-2019.09.09][0] failed to execute bulk item (index) index {[logstash-2019.09.09][_doc][6wDKFm0BsHltw-Q5Xf3S], source[{"syslog_severity_code":6,"syslog_hostname":"xx","tags":[...],"%_logstash_processed_at":"2019-09-09T16:10:24.074Z","@version":"1","syslog_message_outer":"snmpd[1647]: message repeated 2 times: [ Connection from UDP: [x.x.x.x]:58475->[x.x.x.x]:161]","msg":"message repeated 2 times: [ Connection from UDP: [x.x.x.x]:58475->[x.x.x.x]:161]","syslog_pid":"1647","message":"<30>Sep  9 18:10:07 xxxx snmpd[1647]: message repeated 2 times: [ Connection from UDP: [x.x.x.x]:58475->[x.x.x.x]:161]","syslog_facility":"daemon","syslog_severity":"informational","received_from":"x.x.x.x","@timestamp":"2019-09-09T16:10:07.000Z","syslog_pri":30,"syslog_timestamp":"Sep  9 18:10:07","port":45829,"syslog_facility_code":3,"syslog_program":"snmpd","syslog_message_inner":"message repeated 2 times: [ Connection from UDP: [x.x.x.x]:58475->[x.x.x.x]:161]","host":"x.x.x.x"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse field [syslog_timestamp] of type [date] in document with id '6wDKFm0BsHltw-Q5Xf3S'. Preview of field's value: 'Sep  9 18:10:07'
        at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:299) ~[elasticsearch-7.3.1.jar:7.3.1]
        at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:488) ~[elasticsearch-7.3.1.jar:7.3.1]
....

Anyone? :cry:

"Sep 9 18:10:07" is not a recognized date format of Elasticsearch. And suggest you convert these formats ["MMM d HH:mm:ss", "MMM dd HH:mm:ss", "ISO8601"] to one unified format.

?
The Logstash date filter is supposed to parse these timestamps and inject a timestamp appropriate for Elasticsearch, isn't it?
(And "Sep  9 18:10:07" matches "MMM  d HH:mm:ss" just fine.)

(https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html#plugins-filters-date-match)

Ah, I see what you mean.
When I was talking about "date" I meant "@timestamp", but this error is about the actual syslog_timestamp field: the date filter only writes to @timestamp by default, so syslog_timestamp still reaches Elasticsearch as the raw string, which its date mapping can't parse.
Yeah OK, I have to put it into a date format recognized by ES and make sure the field mapping is date. For today's index the mapping has switched to "text" and now produces shard failures...
I'll get that straightened out in Logstash.

For everyone's reference, this is what I now implemented in Logstash, and it seems to work:

  # Re-parse syslog_timestamp in place so it reaches Elasticsearch as a proper date
  if [syslog_timestamp] {
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss", "ISO8601" ]
      target => "syslog_timestamp"
    }
  }

(Or just drop the field if you don't need it)
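
As a sketch of that alternative (assuming the parsed time in @timestamp is all you need, and reusing the same conditional as above), a mutate filter can remove it before the output:

  if [syslog_timestamp] {
    mutate {
      # @timestamp already carries the parsed time, so the raw string can go
      remove_field => [ "syslog_timestamp" ]
    }
  }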
