Issue while parsing XML data from a file

I have XML data in a file and I am trying to parse it as a single event and visualise it in Kibana.
Below is my Logstash configuration file:
input {
    file {
        path => "/Users/varsaraf/outxr.xml"
        start_position => "beginning"
        sincedb_path => "/tmp/sincedbfile"
        exclude => "*.gz"
        type => "xml"
        codec => multiline {
            pattern => "<operation-data"
            negate => "true"
            what => "previous"
        }
    }
}

filter {
    xml {
        source => "message"
        store_xml => false
        target => "doc"
        xpath => [
            "/operation-data/operations/operation/operation-id/text()", "oper-id",
            "/operation-data/operations/operation/operation-time/text()", "oper-time",
            "/operation-data/operations/operation/statistics/specific-stats/icmp-path-jitter-stats/source-address/text()", "source-ip",
            "/operation-data/operations/operation/statistics/specific-stats/icmp-path-jitter-stats/dest-address/text()", "dest-ip",
            "/operation-data/operations/operation/statistics/specific-stats/icmp-path-jitter-stats/hop-address/text()", "hop-id",
            "/operation-data/operations/operation/statistics/specific-stats/icmp-path-jitter-stats/packet-interval/text()", "pkt-interval",
            "/operation-data/operations/operation/statistics/specific-stats/icmp-path-jitter-stats/response-time-count/text()", "RTC",
            "/operation-data/operations/operation/statistics/specific-stats/icmp-path-jitter-stats/packet-count/text()", "pkts",
            "/operation-data/operations/operation/statistics/specific-stats/icmp-path-jitter-stats/packet-loss-count/text()", "pkt-loss"
        ]
    }
}

output {
    elasticsearch {
        codec => json
        hosts => ["localhost:9200"]
        index => "test1"
    }
    stdout {
        codec => rubydebug
    }
}
I am able to create an index, but the fields have data type string. I want them to be long or integer. How do I do that?

You could use mutate+convert. You will need to create a new index to see the type change.
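For example, something along these lines (a minimal sketch using field names from your config above — substitute your own fields):

```
filter {
    mutate {
        convert => {
            "pkts" => "integer"
            "pkt-loss" => "integer"
        }
    }
}
```

Note that convert only changes the type in the event Logstash emits; documents already indexed keep their old mapping, which is why a new index is needed.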

On running the new config file with the mutate and convert corrections added, I get the output attached as a screenshot.

My config file is:

    input {
        file {
            path => "/Users/varsaraf/out.xml"
            start_position => "beginning"
            sincedb_path => "/tmp/mysincedbfilez"
            exclude => "*.gz"
            type => "xml"
            codec => multiline {
                pattern => "<sla-path-jitter-stats>"
                negate => "true"
                what => "previous"
            }
        }
    }
    filter {
        xml {
            source => "message"
            store_xml => false
            target => "sla-path-jitter-stats"
            xpath => [
                "/sla-path-jitter-stats/oper-id/text()", "id",
                "/sla-path-jitter-stats/hop-address/text()", "ip_address",
                "/sla-path-jitter-stats/latest-start-time/text()", "start-time",
                "/sla-path-jitter-stats/latest-rtt-stats/sum-of-rtt/text()", "rtt-sum",
                "/sla-path-jitter-stats/latest-packet-loss-stats/timeouts/text()", "timeout",
                "/sla-path-jitter-stats/latest-packet-loss-stats/packet-loss-count/text()", "packet-loss"
            ]
        }
        mutate {
            convert => {
                "id" => "long"
                "rtt-sum" => "long"
                "timeout" => "long"
                "packet-loss" => "long"
            }
        }
    }
    output {
        elasticsearch {
            codec => json
            hosts => ["localhost:9200"]
            index => "ipslaxe"
        }
        stdout {
            codec => rubydebug
        }
    }

It doesn't create an index or send the data to Elasticsearch, while previously the same config without mutate and convert was sending the data.

"long" is not a type that mutate's convert option supports, so Logstash rejects the configuration. Read the documentation.
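For instance, the convert block from the config above would become the following, using "integer", which mutate does support:

```
filter {
    mutate {
        convert => {
            "id" => "integer"
            "rtt-sum" => "integer"
            "timeout" => "integer"
            "packet-loss" => "integer"
        }
    }
}
```

If you need the Elasticsearch field to be mapped as long rather than integer, that is controlled by the index mapping (for example via an index template), not by the mutate filter.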