I ran into the same confusion recently.
Modifying Logstash's date filter to change the default timezone of the @timestamp sent to Elasticsearch is not the best way, because the @timestamp stored in Elasticsearch in UTC is exactly correct.
I tried the same method, and finally found out that Kibana can display the @timestamp field in the local timezone, but when you use the scripted field doc["@timestamp"].date.hourOfDay, the result is incorrect.
Let's see what happened. For every doc in the Discover panel of Kibana, you will find that the @timestamp field in the Table tab (June 8th 2017, 09:50:01.000) is different from the one in the JSON tab ("@timestamp": "2017-06-08T01:50:01.000Z"). That's because Kibana automatically transforms the @timestamp field (JSON tab) into the correct local timezone format (Table tab). The JSON one is in UTC, which is also correct (for the machine), but its hour number is not what we want.
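To make the two views concrete, here is a small Java sketch (java.time is the same API family Painless exposes) that takes the UTC value from the JSON tab and converts it the way Kibana's Table tab does; Asia/Shanghai is assumed as the browser timezone, matching the example above:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class TimestampDemo {
    public static void main(String[] args) {
        // The UTC value as stored in Elasticsearch (what the JSON tab shows)
        Instant stored = Instant.parse("2017-06-08T01:50:01Z");
        // What the Table tab shows after converting to the local timezone
        ZonedDateTime local = stored.atZone(ZoneId.of("Asia/Shanghai"));
        System.out.println(stored); // 2017-06-08T01:50:01Z
        System.out.println(local);  // 2017-06-08T09:50:01+08:00[Asia/Shanghai]
    }
}
```

Same instant, two renderings: neither value is wrong, they just use different timezones.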
So the problem is NOT caused by the @timestamp field; it is caused by the method we use to get the hourOfDay.
Most of us have read the Lucene Expressions document referenced in Kibana. It tells us to use
doc["@timestamp"].date.hourOfDay
in a scripted field to get the hour of day, but this official document does not tell us that hourOfDay is calculated from the @timestamp in UTC (the one in the JSON tab). What's worse, there is NO other method in Lucene Expressions to change the timezone!
After learning this, I tried to find a method using Painless. Finally, I found this blog: Using painless kibana scripted fields. Following the article, I used the following expression in a scripted field with the Painless language to solve the problem:
LocalDateTime.ofInstant(Instant.ofEpochMilli(doc['@timestamp'].value), ZoneId.of('Asia/Shanghai')).getHour()
PS: No date filter is needed.
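Since Painless uses the java.time classes, the expression above can be checked directly in plain Java. In Painless, doc['@timestamp'].value was the epoch-millis long (in the Elasticsearch versions this answer targets), so the sketch below derives that long from the example timestamp:

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;

public class PainlessEquivalent {
    public static void main(String[] args) {
        // Stand-in for doc['@timestamp'].value (epoch millis)
        long epochMillis = Instant.parse("2017-06-08T01:50:01Z").toEpochMilli();
        // The scripted-field expression, verbatim:
        int hour = LocalDateTime.ofInstant(
                Instant.ofEpochMilli(epochMillis),
                ZoneId.of("Asia/Shanghai")).getHour();
        System.out.println(hour); // 9 -- the local hour we wanted
    }
}
```

Replace Asia/Shanghai with your own timezone ID as needed.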
Hope this helps.