[2020-05-22T09:04:47,691][WARN ][logstash.outputs.amazonelasticsearch]
Could not index event to Elasticsearch. {:status=>400, :action=>
["index", {:_id=>nil, :_index=>"****-2020.05.22",
:_type=>"_doc", :_routing=>nil}, #<LogStash::Event:0x340ebf9d>], :response=>{"index"=>
{"_index"=>"****-2020.05.22", "_type"=>"_doc", "_id"=>"xlmgO3IBnbVUsk86tcKj",
"status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [logtimeStamp

Log content:

INFO {2020-05-22 08:04:40,205} [pool-1-thread-1] (IpePolicy.java:160) -(type:**********) :
fabric:
name: awsCloud
displayName: Cloud Databus

grok {
  match => { "message" => "%{LOGLEVEL:loglevel} {%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second},%{NUMBER:millis}} %{GREEDYDATA:message}" }
  overwrite => [ "message" ]
}
mutate {
  add_field => { "logtimeStamp" => "%{year}/%{month}/%{day} %{hour}:%{minute}:%{second}:%{millis}" }
}
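For the sample log line above, the grok and mutate filters leave a plain string on the event, along the lines of (hypothetical event output, field value reconstructed from the pattern):

```
"logtimeStamp" => "2020/05/22 08:04:40:205"
```

If the index mapping already declares logtimeStamp as a date field, Elasticsearch tries to parse that string with the mapped date format (by default `strict_date_optional_time||epoch_millis`), fails, and rejects the document with the mapper_parsing_exception shown above.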

With the above filters, I was receiving the warning mentioned above.

After adding the filter below, the issue was resolved.
I want to understand: if we are parsing the date from the log file, is the date filter mandatory? And on what basis is the stanza below written?

How did the below resolve my issue?
date {
  match => [ "logtimeStamp", "dd MMM yyyy HH:mm:ss" ]
  locale => "en"
  remove_field => [ "logtimestamp", "year", "month", "day", "hour", "minute", "second", "millis" ]
}
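Note that the pattern passed to `match` has to describe the string actually stored in the field: `dd MMM yyyy HH:mm:ss` expects something like "22 May 2020 08:04:40", not the "%{year}/%{month}/%{day} …" string built by the mutate block. A pattern that matches that string would be (a sketch, assuming the mutate block above is unchanged):

```
date {
  match  => [ "logtimeStamp", "yyyy/MM/dd HH:mm:ss:SSS" ]
  locale => "en"
}
```

When the pattern does not match, the date filter tags the event with `_dateparsefailure` instead of parsing the field, so it is worth checking your events for that tag.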

You are likely getting that mapper_parsing_exception because the mapping definition of the field logtimeStamp differs from the format of the string created in the mutate add_field block. The date filter fixes the problem by parsing that string into a proper timestamp, which then matches the mapping definition.
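As an aside, the intermediate reassembly step can be skipped entirely: capture the whole timestamp in one grok field and hand it straight to the date filter, which by default writes the parsed value into @timestamp (mapped as a date out of the box). A sketch, untested against your pipeline:

```
grok {
  # The log timestamp "2020-05-22 08:04:40,205" matches TIMESTAMP_ISO8601
  match     => { "message" => "%{LOGLEVEL:loglevel} {%{TIMESTAMP_ISO8601:logtime}} %{GREEDYDATA:message}" }
  overwrite => [ "message" ]
}
date {
  # Parse the captured string and drop the scratch field on success
  match        => [ "logtime", "yyyy-MM-dd HH:mm:ss,SSS" ]
  remove_field => [ "logtime" ]
}
```

This avoids creating a custom-formatted string field that can collide with an existing mapping in the first place.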