Could not index event to Elasticsearch: failed to parse date field

[2020-05-18T09:02:52,095][WARN ][logstash.outputs.amazonelasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"gppepo-epo-govpp-scheduler-2020.05.18", :_type=>"_doc", :_routing=>nil}, #LogStash::Event:0x458fc470], :response=>{"index"=>{"_index"=>"*************-2020.05.18", "_type"=>"_doc", "_id"=>"3aQFJ3IBhXn70-zjgi-5", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [logtimeStamp] of type [date] in document with id '3aQFJ3IBhXn70-zjgi-5'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [17/May/2020:23:32:43 +0000] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}
[2020-05-18T09:02:52,119][WARN ][logstash.outputs.amazonelasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"g**************-****r-2020.05.18", :_type=>"_doc", :_routing=>nil}, #LogStash::Event:0x458fc470], :response=>{"index"=>{"_index"=>"***************scheduler-2020.05.18", "_type"=>"_doc", "_id"=>"5KQFJ3IBhXn70-zjgi_O", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [logtimeStamp] of type [date] in document with id '5KQFJ3IBhXn70-zjgi_O'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [17/May/2020:23:32:43 +0000] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}

  1. Can we ignore the above warning?
  2. If the events are not parsed, will they still appear in Elasticsearch?
  3. Both servers have the same Logstash and Elasticsearch versions, and all of them have the same Logstash conf files, but I only notice the above warning on a few of them.
    Please help.

Elasticsearch is returning this error to Logstash. It does not index the document, so you are losing data.

The default date parser in Elasticsearch accepts two formats. One is a number of milliseconds since 1970-01-01 (epoch_millis). For the other (strict_date_optional_time), the year-month-day part is mandatory and the time, separated by T, is optional. Examples: yyyy-MM-dd'T'HH:mm:ss.SSSZ or yyyy-MM-dd. 17/May/2020:23:32:43 +0000 does not match either of those, so Elasticsearch is unable to parse it.

You need to use a date filter in logstash to parse the field.
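For the timestamp in your error (17/May/2020:23:32:43 +0000), a filter along these lines should work; the field name logtimeStamp is taken from the error message, so adjust it if your field is named differently:

date {
    match => [ "logtimeStamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
}

On success the date filter writes the parsed value to @timestamp by default (set the target option if you want it written back to logtimeStamp); on failure it adds a _dateparsefailure tag instead of letting the unparsed string reach Elasticsearch.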


Thank you Badger for the solution. I have the below in my conf file:

if "_grokparsefailure" in [tags] {
  mutate {
    remove_tag => [ "_grokparsefailure" ]
    add_tag => [ "parsefailure" ]
  }
}
If Logstash is not able to parse the logs, will they still appear in Kibana with the parsefailure tag?


If Logstash is unable to parse the logs, they will still appear.

If Elasticsearch is unable to parse the logs, they will be lost.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.