I have log files, and I need to query the resulting documents by a date field. Here is the log's date format:
2020-06-10 14:01:26
I've defined the grok pattern below in Logstash for this log:
filter {
  grok {
    match => { "message" => "%{YEAR:YEAR}-%{MONTHNUM:MONTHNUM}-%{MONTHDAY:MONTHDAY} %{TIME:time} ........." }
  }
  mutate {
    add_field => {
      "log_timestamp" => "%{YEAR}-%{MONTHNUM}-%{MONTHDAY}"
    }
  }
}
The log_timestamp date field works as expected, but I frequently get a Logstash error because of this grok pattern: out of roughly 100 log documents in a day, about 10 are discarded with the error below. Is there a better way to parse the log file and get the date field?
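For example, would capturing the whole timestamp with the built-in TIMESTAMP_ISO8601 pattern and parsing it with a date filter be more reliable? This is an untested sketch: it assumes TIMESTAMP_ISO8601 fits my format (it accepts a space between date and time), and the ......... again stands for the rest of my pattern:

filter {
  grok {
    # TIMESTAMP_ISO8601 matches "2020-06-10 14:01:26" in a single capture
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} ........." }
  }
  date {
    # Parse the captured string into a proper timestamp and store it back
    match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss" ]
    target => "log_timestamp"
  }
}

My thinking is that if grok fails to match here, log_timestamp is never created, so the date filter has nothing to parse and the event should not be rejected by the date mapping.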
Error message:
[2020-06-10T06:43:18,756][WARN ][logstash.outputs.elasticsearch][log] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"log-2020-06", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x7f4158ff>], :response=>{"index"=>{"_index"=>"log-2020-06", "_type"=>"_doc", "_id"=>"KSSvm3IBbBjRTTrvQZx8", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [log_timestamp] of type [date] in document with id 'KSSvm3IBbBjRTTrvQZx8'. Preview of field's value: '%{YEAR}-%{MONTHNUM}-%{MONTHDAY}'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [%{YEAR}-%{MONTHNUM}-%{MONTHDAY}] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}
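In the meantime, the preview value in the error is the literal, unsubstituted sprintf string, so I assume the grok pattern simply did not match those events. Would guarding the mutate with grok's default _grokparsefailure tag at least stop the indexing failures? Another untested sketch:

filter {
  grok {
    match => { "message" => "%{YEAR:YEAR}-%{MONTHNUM:MONTHNUM}-%{MONTHDAY:MONTHDAY} %{TIME:time} ........." }
  }
  # Only build log_timestamp when the pattern actually matched,
  # so unmatched events are indexed without the bogus date value
  if "_grokparsefailure" not in [tags] {
    mutate {
      add_field => { "log_timestamp" => "%{YEAR}-%{MONTHNUM}-%{MONTHDAY}" }
    }
  }
}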