How to create mapping fields using Logstash

I am trying to sync MySQL data with Elasticsearch using Logstash, but I can't define a field's type as "keyword" or assign a normalizer from Logstash. How do I solve this issue?

I have to perform many wildcard queries, which is why I want the field mapped as the keyword type.
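For context, this is the kind of wildcard query I need to run against the keyword field (the index name follows the `%{appid}-%{type}` pattern from the config below; the query value is only an illustration):

```json
GET 142-project/_search
{
  "query": {
    "wildcard": {
      "name": {
        "value": "proj*"
      }
    }
  }
}
```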

input {
  jdbc {
    jdbc_driver_library => "/home/arun/Downloads/mysql-connector-java-5.1.42.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "databaseconnection"
    jdbc_user => "resuemb"
    jdbc_password => "JlZTDCte^0F~F1A"
    #schedule => "*/1 * * * *"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "20000"
    jdbc_fetch_size => "50000"
    statement => "SELECT pf.id, pf.master_record_id, pf.id AS recordid, last_viewed_date_time, pf.project_name AS name, pf.app_id AS appid, pf.owner_id FROM project_fields pf WHERE IFNULL(pf.isdeleted_recycle_bin, 0) = 0 AND pf.app_id = 142 LIMIT 50000"
    type => "project"
  }
}
filter {
  mutate {
    convert => {
      "name" => "keyword"
    }
  }
}
output {
  elasticsearch {
    index => "%{appid}-%{type}"
    document_id => "%{master_record_id}"
    hosts => ["127.0.0.1:9200"]
  }
}
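As far as I understand, the mutate filter's `convert` option only accepts Logstash-level types such as `integer`, `float`, `string`, and `boolean`; `keyword` and normalizers are Elasticsearch mapping concepts, so `convert => { "name" => "keyword" }` will not work. The mapping has to exist on the Elasticsearch side before the documents arrive. A sketch of an index template that would do this, assuming a template named `project_template` and a custom `lowercase_normalizer` (both names are illustrative, and the exact template syntax varies by Elasticsearch version):

```json
PUT _template/project_template
{
  "index_patterns": ["*-project"],
  "settings": {
    "analysis": {
      "normalizer": {
        "lowercase_normalizer": {
          "type": "custom",
          "filter": ["lowercase"]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "name": {
        "type": "keyword",
        "normalizer": "lowercase_normalizer"
      }
    }
  }
}
```

With a template like this in place, the `mutate` block can be removed from the pipeline, and any new index matching `*-project` gets `name` mapped as keyword with the normalizer applied. The elasticsearch output plugin also has `template`, `template_name`, and `manage_template` options that can install such a template from a local file, if managing it through Logstash is preferred.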
