Hello,
I got the following exception while indexing data into Elasticsearch using Logstash:
Could not index event to elasticsearch
"error"=>{"type"=>"strict_dynamic_mapping_exception", "reason"=>"mapping set to strict, dynamic introduction of [caseid] within [globalsearch_comments] is not allowed"}
I think Logstash is trying to match the event JSON against the index properties, but the JSON keys Logstash produces are always lowercase. Is there any way to configure Logstash so that it returns JSON keys matching the index properties exactly?
It works fine if I use lowercase JSON keys for the properties when creating the index.
The following is the JSON structure I use to create the index:
{
  "mappings": {
    "globalsearch_comments": {
      "dynamic": "strict",
      "properties": {
        "CommentID": {
          "type": "integer"
        },
        "CaseID": {
          "type": "integer"
        }
      }
    }
  }
}
Logstash conf file:

input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://XXXXXXXX;databaseName=XXXXXX"
    jdbc_user => "XXX"
    jdbc_password => "XXXXX$"
    jdbc_driver_library => "D:\sqljdbc4-4.0.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * FROM dbo.Comments"
  }
}

filter {
  mutate {
    remove_field => ["@version", "@timestamp"]
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
    index => "globalsearch_comments"
    document_type => "globalsearch_comments"
    document_id => "%{commentid}"
  }
}
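If I understand the jdbc input plugin correctly, it lowercases column names by default, which would explain why [CaseID] arrives as [caseid]. A sketch of the workaround I am considering, assuming the `lowercase_column_names` option is available in my plugin version (connection settings elided, same as above):

```
input {
  jdbc {
    # ... same connection/driver settings as in the conf above ...

    # Assumed option: keep column names exactly as SQL Server returns them
    # instead of lowercasing them (the plugin's default is true)
    lowercase_column_names => false
    statement => "SELECT * FROM dbo.Comments"
  }
}
```

Alternatively, I suppose a mutate rename in the filter block could map the lowercase keys (e.g. caseid, commentid) back to the cased names the strict mapping expects, but that means listing every column by hand.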