Logstash doesn't parse milliseconds with more than 4 digits

Hello. I'm running Logstash 6.6.1 with the conf file shown below.

input {
  file {
    start_position => "beginning"
    path => "/logs/log.json"
    type => json
  }
}
filter {
        mutate {
                rename => ["message","log_message"]
        }
        grok {
                match => { log_message => "{\"timestamp\":\"%{TIMESTAMP_ISO8601:date}\",\"severity\":\"%{WORD:severity}\",\"order_id\":%{GREEDYDATA:order_id},\"data\":%{GREEDYDATA:data},\"message\":\"%{GREEDYDATA:message}\",\"log_id\":\"%{GREEDYDATA:log_id}\",\"friendly_order_id\":%{GREEDYDATA:friendly_order_id},\"client\":%{GREEDYDATA:client}}" }
        }
        mutate {
                remove_field => ["log_message"]
                gsub => ["client", "\"", ""]
                gsub => ["friendly_order_id", "\"", ""]
                gsub => ["order_id", "\"", ""]
        }

        date {
                match => [ "date", "YYYY-MM-dd'T'HH:mm:ss,SSSS+00:00", "yyyy-MM-dd'T'HH:mm:ss.SSSSSS+00:00"]
                target => "newtimestamp5"
        }
}

output {
        elasticsearch {
                index => "tanuki-prod-%{+YYYY.MM.dd}"
                hosts => ["172.17.0.1:9200"]
        }

}

Log lines start with:

{"timestamp":"2019-06-28T07:34:13.624700+00:00","severity":"info"...

I'm trying to parse milliseconds with 4 digits using:

date {
        match => [ "date", "YYYY-MM-dd'T'HH:mm:ss,SSSS+00:00", "yyyy-MM-dd'T'HH:mm:ss.SSSSSS+00:00"]
        target => "newtimestamp5"
}

and I get a document like this:

{
  "_index": "tanuki-2019.06.28",
  "_type": "doc",
  "_id": "2KeanWsBY5RZea3l51fI",
  "_version": 1,
  "_score": null,
  "_source": {
    "newtimestamp": "2019-06-28T10:21:14.197Z",
    "severity": "info",
    "timestamp": "2019-06-28T10:21:14.197800+00:00",
    "message": "Outgoing response",
    "type": "json",
    "client": "OMS",
    "host": "tanuki",
    "date": "2019-06-28T10:21:14.197800+00:00",
    "log_id": "c4bd425e-bfd9-4a71-9b31-4a594c700710",
    "data": "{"code":200,"headers":["Content-Type: application\/json;charset=UTF-8"],"body":"{\"result\":{\"successful\":true,\"code\":200},\"payload\":{\"order_id\":\"86e2b9c5-5e28-4f6f-9c03-f776b464ee4c\",\"message\":\"Promo-code you provided is not valid. Your order was registered anyway though.\",\"friendly_order_id\":\"10000095\"}}","Request duration":"1.353"}",
    "@version": "1",
    "@timestamp": "2019-06-28T10:21:42.654Z",
    "order_id": "",
    "friendly_order_id": "null",
    "path": "/logs/log.json"
  },
  "fields": {
    "date": [
      "2019-06-28T10:21:14.197Z"
    ],
    "@timestamp": [
      "2019-06-28T10:21:42.654Z"
    ]
  },
  "sort": [
    1561717274197
  ]
}

But for my new field "newtimestamp" I see the message "no cached mapping for this field", and it only has 3 digits: 2019-06-28T10:21:14.197Z. How can I fix it?

Thank you very much in advance!

You cannot fix it. The LogStash::Timestamp type only stores millisecond precision. You would have to save the sub-millisecond part of the timestamp in a different field.
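For example (a rough sketch, not tested against your data, assuming the original string is still in the date field and always ends in +00:00), you could pull the fractional-second digits into their own field with grok:

filter {
        grok {
                # copy the digits after the "." into a separate field;
                # "subsecond" is just an example field name
                match => { "date" => "\.(?<subsecond>\d+)\+00:00" }
        }
}

For the document above, subsecond would then hold 197800.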

Thanks for the reply. I solved it using add_field => { "newtimestamp" => "%{date}" } and adding this field in Kibana.
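In filter context that looks roughly like this (a sketch based on the snippet above; newtimestamp simply keeps the original string, including the microseconds, alongside @timestamp):

filter {
        mutate {
                # copy the original timestamp string, with its full
                # sub-millisecond precision, into a separate field
                add_field => { "newtimestamp" => "%{date}" }
        }
}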
