Issue with grok and timestamp field in apache logs

I've been going through this example:

https://www.elastic.co/guide/en/logstash/5.0/advanced-pipeline.html

And I used the whole config from the example (besides removing the geoip filter):

input {
  file {
    path => "/home/trueal/elk/support_scripts/*.log"
    start_position => beginning
    ignore_older => 0
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
  }
}

While importing the data into Elasticsearch works, the mapping created for the timestamp field looks like this:

               "timestamp": {
                    "type": "text",
                    "norms": false,
                    "fields": {
                        "raw": {
                            "type": "keyword",
                            "ignore_above": 256
                        }
                    }
                },

I would expect timestamp to be of type date instead of text, so that I can use the field in Kibana later on.
I'm not sure if this is a bug in grok or working as intended.

Ok, I quickly set up an "old" ELK stack and it behaves the same, i.e. it also maps timestamp to string.
Still not very helpful, but it seems to be grok then?

The grok filter always parses the timestamp as a string; it does not convert field types. You should use the date filter to convert it into a date. By default the result is stored in the @timestamp field, which the default Logstash index template maps as a date.
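
As a sketch, the filter section could look like this; the pattern matches the Apache timestamp format (e.g. 04/Jan/2017:13:40:01 +0100) that %{COMBINEDAPACHELOG} extracts into the timestamp field:

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    # parse the Apache access-log timestamp and write it to @timestamp
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

If you no longer need the original string field after conversion, you can add remove_field => [ "timestamp" ] to the date filter's options.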