I'm indexing logs from a server into Elasticsearch. The timestamp of the logs is in the format
2018-09-18 11:41:01,648160559
The grok pattern I am using to match it is
%{TIMESTAMP_ISO8601:logTimestamp}
Since I want the logs to be indexed on logTimestamp, I am using the date plugin as follows:
date {
  match => ["logTimestamp", "YYYY-MM-dd HH:mm:ss,SSSSSSSSS"]
  target => "logTimestamp"
}
But it is giving me an exception in Logstash:
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [logTimestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "2018-09-18 10:42:59,951000000" is malformed at " 10:42:59,951000000""}}}}}
Elasticsearch's date type only supports millisecond precision (three decimal places), so I believe you need to capture the last six digits of the timestamp in a separate field.
That value does not match the pattern you have, so it is not surprising that it fails. You could try adding a second grok filter to split the timestamp, copying the last six digits into a field named nanos:
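A sketch of that approach; note that the field names logTimestampMillis and nanos here are my own choices, not anything predefined:

```
filter {
  # Capture the last six digits (the sub-millisecond part) into "nanos".
  # DATA is non-greedy, so it stops right before the final six digits,
  # leaving a millisecond-precision timestamp in logTimestampMillis,
  # e.g. "2018-09-18 11:41:01,648".
  grok {
    match => { "logTimestamp" => "^%{DATA:logTimestampMillis}(?<nanos>\d{6})$" }
  }
  # Parse the truncated value with a millisecond pattern and write the
  # result back to logTimestamp so documents index on it.
  date {
    match  => ["logTimestampMillis", "yyyy-MM-dd HH:mm:ss,SSS"]
    target => "logTimestamp"
  }
  # The intermediate field is no longer needed.
  mutate {
    remove_field => ["logTimestampMillis"]
  }
}
```

With this, logTimestamp becomes a proper date that Elasticsearch can parse, and the nanos field keeps the remaining precision as a string if you still need it.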