Predefined Elasticsearch mapping not applied to data uploaded from a MySQL table via Logstash

I'm using Logstash to upload data from MySQL to Elasticsearch.

I've created an index named testindex as follows:

PUT /testindex
{
"settings" : {
"number_of_shards" : 1
},
"mappings" : {
"type1" : {
"_source" : { "enabled" : false },
"properties" : {
"callref":{"type" : "string", "index" : "not_analyzed","null_value" : "NA"},
"category":{"type" : "string", "index" : "not_analyzed","null_value" : "NA"},
"customer":{"type" : "string", "index" : "not_analyzed","null_value" : "NA"},
"description":{"type" : "string", "index" : "not_analyzed","null_value" : "NA"},
"closedby":{"type" : "string", "index" : "not_analyzed","null_value" : "NA"},
"fixedby":{"type" : "string", "index" : "not_analyzed","null_value" : "NA"},
"category1":{"type" : "string", "index" : "not_analyzed","null_value" : "NA"},
"comments":{"type" : "string", "index" : "not_analyzed","null_value" : "NA"},
"location":{"type" : "geo_point"}
}
}
}
}
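
For reference, after creating the index I can confirm that the settings and mapping actually took effect with the standard mapping and settings APIs:

GET /testindex/_mapping
GET /testindex/_settings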

I'm using the following config file in Logstash:

input {
  jdbc {
    jdbc_driver_library => "D:\Users\admin\setups\elasticsearch-2.1.0\plugins\mysql-connector-java-5.1.37-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/testdatabase"
    jdbc_user => "root"
    jdbc_password => "root"
    statement => "SELECT * FROM fma_insights_data"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "index"
    index => "testindex"
  }
  stdout { codec => rubydebug }
}
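
Note that I haven't set a document type in the output, so Logstash uses its default. A minimal sketch of pinning the type explicitly to match the mapping above would look like this (document_type is the logstash-output-elasticsearch option; "type1" is taken from my mapping):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "index"
    index => "testindex"
    document_type => "type1"  # match the type defined in the predefined mapping
  }
  stdout { codec => rubydebug }
}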

If I don't create the mapping first and just upload via Logstash, it creates the "testindex" index with the corresponding data.

But if I create the mapping first and then upload, the index "testindex" exists but contains no data.
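
To make the symptom concrete, the document count for the index can be checked with the count API:

GET /testindex/_count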

Can anyone suggest an appropriate solution?

Thanks.