Unable to use the app-generated timestamp in Logstash

I have the log entry below, created by the application. The timestamp value is recorded in "@timestamp", but Logstash is unable to use that timestamp and fails to push this value to Elasticsearch:

    {
        "@timestamp": "2019-08-29T13:02:47.468Z",
        "id": "34947135803057280532301987232169408483366183409373609984",
        "messageType": "DATA_MESSAGE"
    }

I tried to use the date filter to push this value to Elasticsearch, but Logstash is putting the current time into @timestamp instead, which is not helping here:

    date {
        match => ["@timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
        timezone => "Europe/Paris"
    }

    [logstash.filters.json ] Unrecognized @timestamp value, setting current time to @timestamp, original in _@timestamp field {:value=>"\"2019-08-30 13:38:25.864\""}
    [logstash.filters.json ] Unrecognized @timestamp value, setting current time to @timestamp, original in _@timestamp field {:value=>"\"2019-08-30 13:38:25.851\""}

How can I get the @timestamp value generated by the logs to be used as the default timestamp?

Your problem is that your JSON contains a field called @timestamp holding a string, while Logstash expects @timestamp to be a LogStash::Timestamp, so the json filter stores it in _@timestamp instead. Just change your date filter to match that field:

    match => ["_@timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
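
Put together, a minimal sketch of the filter block this implies (assuming a json filter is parsing the message, as described above; remove_field is a common option, so it only removes _@timestamp when the date parse succeeds):

    filter {
        json {
            source => "message"
            skip_on_invalid_json => true
        }
        date {
            match => ["_@timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
            timezone => "Europe/Paris"
            remove_field => ["_@timestamp"]
        }
    }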

Hi Badger - I tried the above, but I am still seeing that the @timestamp value generated by the application cannot be processed by Logstash.

Currently the message field is received by Logstash as a string field containing the timestamp value. I used the json filter to extract the values, and then applied the date filter to get the timestamp value.

Message sent to Logstash from Kinesis:

    {
        "@timestamp": "2019-08-29T13:02:47.468Z",
        "id": "34947135803057280532301987232169408483366183409373609984",
        "message": "{\"@timestamp\": \"2019-08-29 13:02:47.467\", \"priority\": \"INFO\"}\n",
        "@version": "1"
    }

Filter applied

    json {
        source => "message"
        skip_on_invalid_json => true
        remove_field => [ "message" ]
    }

    date {
        match => ["_@timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
    }

I also tried setting a timezone in the filter, but I still get the same result.
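
One thing worth checking: when the date filter fails to match, it tags the event rather than logging a warning, so the tags field on the indexed document shows whether the match pattern is the problem. The default failure tag is _dateparsefailure, written out explicitly here for illustration:

    date {
        match => ["_@timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
        tag_on_failure => ["_dateparsefailure"]
    }

If documents in Elasticsearch carry that tag, the field name or the pattern in match is not lining up with the event.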

What do you see in the _@timestamp field when you look at a document in Elasticsearch or Kibana?
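
A quick way to see every field on the event as Logstash emits it (an illustrative debugging output, not part of the original pipeline) is the rubydebug codec on a stdout output:

    output {
        stdout { codec => rubydebug }
    }

That prints each event with all of its fields, including _@timestamp if the json filter created it.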
