Date Format is Malformed

I'm indexing logs from a server into Elasticsearch. The timestamp of the logs is in the format

2018-09-18 11:41:01,648160559

The grok pattern I am using to match it is

%{TIMESTAMP_ISO8601:logTimestamp}
Since I want the logs to be indexed on logTimestamp, I am using the date plugin as

date {
  match => ["logTimestamp", "YYYY-MM-dd HH:mm:ss,SSSSSSSSS"]
  target => "logTimestamp"
}
But it is giving me an exception in Logstash:

{"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [logTimestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "2018-09-18 10:42:59,951000000" is malformed at " 10:42:59,951000000""}}}

Elasticsearch's date type only supports millisecond precision (3 fractional digits), so I believe you need to capture the last 6 digits of the timestamp in a separate field.
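One way to do that truncation (a sketch, not from the original reply; it assumes a mutate gsub with a capture group behaves as in Ruby's String#gsub) is to strip the last six fractional digits before the date filter runs:

```
filter {
  # Keep only the first 3 fractional digits (milliseconds), drop the last 6
  mutate {
    gsub => ["logTimestamp", ",(\d{3})\d{6}$", ",\1"]
  }
  # Parse the now millisecond-precision timestamp
  date {
    match => ["logTimestamp", "YYYY-MM-dd HH:mm:ss,SSS"]
    target => "logTimestamp"
  }
}
```

This discards the sub-millisecond digits entirely; if they need to be preserved, they would have to be copied to another field first.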

I am trying:

date {
  match => ["logTimestamp", "YYYY-MM-dd HH:mm:ss,SSS00000"]
  target => "logTimestamp"
}

But it is still giving me the same error.

That does not match the pattern you have, so it is not surprising that it fails. You could try adding a second grok filter to do the parsing, like in this example, which copies the last 6 digits to a field named nanos:

grok {
  match => {
    "logTimestamp" => ["%{TIMESTAMP_ISO8601:logTimestamp}(?<nanos>\d{6})"]
  }
  overwrite => ["logTimestamp"]
}
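Assuming that grok succeeds, logTimestamp would be left with a 3-digit fraction, and a date filter along these lines could then parse it (a sketch, not part of the original reply):

```
date {
  # logTimestamp now ends in ,SSS after the last 6 digits were moved to nanos
  match => ["logTimestamp", "YYYY-MM-dd HH:mm:ss,SSS"]
  target => "logTimestamp"
}
```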

I'm still getting the same error. Also, I am pretty sure it was working before this fix. Is there something else that might be causing the problem?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.