Hi,
I have an Elasticsearch index entry like this:
{
  "_index": "myindex-2017-01-25",
  "_type": "logs",
  "_id": "AVnVKXGK2fWIWeetV6jl",
  "_score": null,
  "_source": {
    "severity": "SEVERE",
    "logtype": "mylog",
    "@timestamp": "2017-01-25T10:23:40.949Z",
    "@version": "1",
    "methodName": "getNames",
    "className": "com.school.employee.teacher",
    "message": "##SEVERE 25-Jan-2017 10:23:40.949 com.school.employee.teacher getNames user_not_found##",
    "error": " user_not_found##",
    "timestamp": "25-Jan-2017 10:23:40.949"
  },
  "fields": {
    "@timestamp": [
      1485339820949
    ]
  },
  "sort": [
    1485339820949
  ]
}
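To double-check that the stored value is correct, here is a quick Python sketch (outside my pipeline, assuming fields.@timestamp is milliseconds since the Unix epoch) that converts it back to UTC:

from datetime import datetime, timezone

# The indexed value is epoch milliseconds.
epoch_ms = 1485339820949
dt = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)
print(dt.isoformat(timespec="milliseconds"))
# 2017-01-25T10:23:40.949+00:00 -- matches @timestamp in _source

So the indexed value itself looks right to me.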
The timestamp in the log file is 25-Jan-2017 10:23:40.949. I am retrieving this from Kafka using the Logstash Kafka input, and I use a date filter like this:
filter {
  grok {
    match => { "message" => "^##(?<severity>(SEVERE|INFO|WARN)) (?<timestamp>%{MONTHDAY}-%{MONTH}-%{YEAR} %{TIME}) %{NOTSPACE:className} %{NOTSPACE:methodName} %{GREEDYDATA:error}" }
  }
  date {
    match => [ "timestamp", "dd-MMM-yyyy HH:mm:ss.SSS" ]
    timezone => "UTC"
    target => "@timestamp"
  }
}
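To make sure I understand what the date filter should produce, here is a rough Python equivalent of the pattern "dd-MMM-yyyy HH:mm:ss.SSS" with timezone => "UTC" (just an illustration on my side, not something Logstash runs):

from datetime import datetime, timezone

# Parse the raw log timestamp the way I expect the date filter to.
raw = "25-Jan-2017 10:23:40.949"
parsed = datetime.strptime(raw, "%d-%b-%Y %H:%M:%S.%f").replace(tzinfo=timezone.utc)
print(parsed.isoformat(timespec="milliseconds"))
# 2017-01-25T10:23:40.949+00:00 -- what I expect to land in @timestamp

That matches the @timestamp string in the document above.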
The @timestamp and timestamp fields in the JSON are correct, but when I look at the @timestamp field in Kibana it shows January 25th 2017, 15:53:40.949, which is not correct. I have attached screenshots for reference.
How can I fix this?
Thanks in advance.