Logstash data isn't being indexed by Elasticsearch

Hi all

I'm using:
・filebeat-1.2.3-1.x86_64 (CentOS 6.4)
・logstash-2.3.2-1.noarch.rpm (Amazon Linux)
・elasticsearch-2.3.5-1.noarch (Amazon Linux)
・kibana-4.5.4-1.x86_64 (Amazon Linux)

My nginx access log is formatted as LTSV and shipped to Logstash by Filebeat. Logstash parses the LTSV events with the configuration below and sends them to Elasticsearch, but no data ends up stored in Elasticsearch.
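For reference, a line of the access log looks roughly like this (illustrative values; the field labels are the ones referenced in the filter below):

```
time:[18/Aug/2016:12:00:00 +0900]	host:192.0.2.1	ua:Mozilla/5.0	status:200	size:1024	reqtime:0
```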

Is there a problem with this configuration?

[Logstash configuration]
input {
  beats {
    port => 5044
  }
}

filter {
  kv {
    # LTSV: tab-separated label:value pairs
    field_split => "\t"
    value_split => ":"
  }
  date {
    # field name must be a quoted string;
    # yyyy (calendar year), not YYYY (week year)
    match => ["time", "'['dd/MMM/yyyy:HH:mm:ss Z']'"]
    locale => "en"
  }
  useragent {
    source => "ua"
    prefix => "ua."
  }
  mutate {
    # convert keys and targets must be quoted strings
    convert => {
      "status" => "integer"
      "reqtime" => "integer"
      "size" => "integer"
    }
  }
}

# output {
#   stdout {
#     codec => rubydebug
#   }
# }

output {
  elasticsearch {
    hosts => ["host-ip-address:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
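One way to check whether events are leaving Logstash at all is to enable the rubydebug stdout output alongside elasticsearch in the same output block, instead of keeping it commented out (a debugging sketch using the same settings as above):

```
output {
  # print each parsed event to the console for debugging
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["host-ip-address:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
```

If events print to stdout but still don't appear in Elasticsearch, the problem is on the output side rather than in the filters.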

To restate the problem:

nginx (access log in LTSV format) => filebeat => logstash (parse) => elasticsearch

But the data never makes it into Elasticsearch.

How can I get the data indexed into Elasticsearch?
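In case it's useful, this is how I'm checking whether anything was indexed (`host-ip-address` is a placeholder for the Elasticsearch host):

```shell
# List all indices; a filebeat-YYYY.MM.dd index should appear if data arrived
curl 'http://host-ip-address:9200/_cat/indices?v'

# Count documents matching the index pattern used in the output above
curl 'http://host-ip-address:9200/filebeat-*/_count?pretty'
```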