HAProxy data: every field is a string

I am using the latest ELK versions and the built-in Logstash pattern for HAProxy data, e.g.:

input {
  udp {
    type => "haproxy"
    port => "3513"
  }
}

# HAProxy HTTP logs
filter {
  if [type] == "haproxy" {
    grok {
      match => { "message" => "%{HAPROXYHTTPBASE}" }
    }
  }
}

# Output to local Elasticsearch
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

With this, all the fields are matched, and looking at the Logstash patterns I can see that many fields appear to be correctly typed as int or ip: https://github.com/logstash-plugins/logstash-patterns-core/blob/master/patterns/haproxy

Despite this, all the fields appear to be strings, and I get errors trying to visualize them as numbers: Visualize: Expected numeric type on field [srvconn], but got [string]

Does anyone have any idea why they are being stored in Elasticsearch as strings despite Logstash being told in the pattern that they are numbers? Any help would be greatly appreciated.

Does anyone have any idea why they are being stored in Elasticsearch as strings despite Logstash being told in the pattern that they are numbers?

So a section of the pattern looks like this:

%{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries}

This does not mean that actconn et al. will become integer fields, only that the text has to match an integer; the captured values are still stored as strings. To actually produce integer fields, the pattern would have to look like this:

%{INT:actconn:int}/%{INT:feconn:int}/%{INT:beconn:int}/%{INT:srvconn:int}/%{NOTSPACE:retries}

Alternatively, use the mutate filter's convert option to change the data type after the fact.
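As a minimal sketch of that mutate approach, wrapped in the same conditional as your existing filter and assuming the field names produced by the stock HAPROXYHTTPBASE pattern:

filter {
  if [type] == "haproxy" {
    mutate {
      # Cast the string values grok captured into integers.
      # Field names here follow the stock HAPROXYHTTPBASE pattern.
      convert => {
        "actconn" => "integer"
        "feconn"  => "integer"
        "beconn"  => "integer"
        "srvconn" => "integer"
      }
    }
  }
}

Note that this only changes the type of the values in new events; fields already indexed under a string mapping stay strings until the index is recreated.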

Thanks for that. I have created a separate filter/grok rather than using the built-in one, like this:

match => [
  "message", "%{IP:client_ip}:%{NUMBER:client_port:int} \[%{NOTSPACE:haproxy_timestamp}\] %{NOTSPACE:frontend_name} %{NOTSPACE:backend_name}/%{NOTSPACE:server_name} %{NUMBER:time_queue:int}/%{NUMBER:time_backend_connect:int}/%{NUMBER:time_duration:int} %{NUMBER:bytes_read:int} %{NOTSPACE:termination_state} %{NUMBER:actconn:int}/%{NUMBER:feconn:int}/%{NUMBER:beconn:int}/%{NUMBER:srvconn:int}/%{NUMBER:retries:int} %{NUMBER:srv_queue:int}/%{NUMBER:backend_queue:int}"
]

I then deleted all my Logstash indexes so the data could be stored as integers. For most fields it has worked fine, but for a couple, like %{NUMBER:backend_queue:int}, it is still storing them as strings!
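A quick way to confirm which fields are still mapped as strings is to inspect the index mapping directly; a sketch assuming the default logstash-* index naming:

curl -XGET 'http://localhost:9200/logstash-*/_mapping?pretty'

If backend_queue shows up there as a string type, the mapping was created from an earlier event before the :int change took effect, and the index needs to be recreated for the new type to apply.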

I am going to try deleting everything again, including the Kibana index, but I am not sure why it worked for 90% of the fields and failed for 10%. Anyway, thanks for your help so far!
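For anyone following along, a sketch of that cleanup step, again assuming the default index names; be aware this permanently deletes the indexed data and any saved Kibana objects:

# Delete all Logstash data indices so new mappings are created on next ingest.
curl -XDELETE 'http://localhost:9200/logstash-*'
# Delete the Kibana index (saved searches, visualizations, dashboards).
curl -XDELETE 'http://localhost:9200/.kibana'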