I suspect you need to add the timezone setting to your date filter; otherwise it has to make an assumption about which timezone the timestamp is in.
Also, please keep in mind that all dates in Elasticsearch are stored as UTC. If you look at them in the Kibana apps they will be displayed in your local timezone; if you retrieve them directly via curl / the API they will show in UTC.
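To illustrate the store-as-UTC behavior (a Python sketch, not Logstash itself; the sample date is made up):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A log line stamped 12:00 Moscow time (UTC+3)
local = datetime(2024, 5, 1, 12, 0, 0, tzinfo=ZoneInfo("Europe/Moscow"))

# Elasticsearch stores the UTC instant; Kibana re-localizes it for display
stored_utc = local.astimezone(ZoneInfo("UTC"))
print(stored_utc.isoformat())  # 2024-05-01T09:00:00+00:00
```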
The code below will not extract the time out of your message field. You need to parse the message first with a grok filter and then pass the extracted timestamp field into the date filter. What is most likely happening is that the date filter is failing completely (the documents in Elasticsearch probably carry a tag like _dateparsefailure), and since it is failing, Logstash just inserts the "now" time instead.
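You can check for those failures with a query like the one below (the index pattern is an assumption; adjust it to yours):

```
GET my-index-*/_search
{
  "query": { "term": { "tags": "_dateparsefailure" } }
}
```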
This won't work, because you are passing in the whole message rather than just the timestamp field:
filter {
  date {
    match => ["message", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss.SSS", "ISO8601"]
    timezone => "Europe/Moscow"
  }
}
So your filter should look something like this:
filter {
  grok {
    match => {
      "message" => [
        # Parse the message. This is just an example pattern;
        # you can list multiple patterns here and grok tries them in order.
        "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \[%{DATA:class}\]:%{GREEDYDATA:message}"
      ]
    }
  }

  # Now you can parse the extracted timestamp
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss.SSS", "ISO8601"]
    timezone => "Europe/Moscow"
  }
}
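To see what the date filter then does with the extracted field, here is a Python sketch of the equivalent logic (the sample timestamp is hypothetical; the format string mirrors the "yyyy-MM-dd HH:mm:ss,SSS" pattern above):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A timestamp field as grok would extract it from a log line
timestamp = "2024-05-01 10:00:00,123"

# Equivalent of the Logstash pattern "yyyy-MM-dd HH:mm:ss,SSS"
naive = datetime.strptime(timestamp, "%Y-%m-%d %H:%M:%S,%f")

# timezone => "Europe/Moscow" tells the date filter how to interpret
# the zone-less timestamp...
local = naive.replace(tzinfo=ZoneInfo("Europe/Moscow"))

# ...and the event is then stored as the corresponding UTC instant
print(local.astimezone(ZoneInfo("UTC")).isoformat())
# 2024-05-01T07:00:00.123000+00:00
```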