Hello
My current platform is composed of filebeat -> logstash -> elasticsearch -> Kibana (everything on version 7.3), and I have configured a grok filter in logstash that extracts the time from the log and puts it in the @timestamp field.
Until two days ago everything was working fine, but suddenly I started to receive this message from the logstash service:
Aug 23 21:43:04 hostname logstash[2015]: [2019-08-23T21:43:04,231][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"index", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x4903369f>], :response=>{"index"=>{"_index"=>"index", "_type"=>"_doc", "_id"=>"kONuwGwB1C-6uPvz0ZdP", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [timestamp] of type [date] in document with id 'kONuwGwB1C-6uPvz0ZdP'. Preview of field's value: '2019-08-23 21:42:41,863'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2019-08-23 21:42:41,863] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
This is my logstash configuration:
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp}"]
  }
  date {
    match => ["timestamp", "ISO8601"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
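From the error ("failed to parse date field [2019-08-23 21:42:41,863] with format [strict_date_optional_time||epoch_millis]"), it looks like Elasticsearch is trying to parse the raw "timestamp" string field itself, and its strict ISO8601 parser rejects the comma before the milliseconds. One variant I am considering (an untested sketch, assuming the comma-millisecond format shown in the log above) is to give the date filter an explicit pattern and drop the raw field once it has been parsed into @timestamp:

filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp}"]
  }
  date {
    # Try the comma-millisecond pattern first, fall back to ISO8601.
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "ISO8601"]
    # Default target is @timestamp; stated here for clarity.
    target => "@timestamp"
    # Remove the raw string so Elasticsearch never tries to map
    # the [timestamp] field as a date on its own.
    remove_field => ["timestamp"]
  }
}

The remove_field part seems relevant because the error complains about the field [timestamp], not @timestamp.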
Has anyone run into the same error before? If so, I would appreciate hearing how you fixed it.
Thanks
Regards