Set and persist Elasticsearch index mappings for couchdb_changes input from Logstash

I am currently populating an Elasticsearch index from Logstash using the couchdb_changes input plugin, with the following configuration:
input {
  couchdb_changes {
    type => "meet"
    db => "my_db"
    host => "127.0.0.1"
    port => 5984
    initial_sequence => 0
    keep_revision => true
  }
}

filter {
  if [@metadata][action] == "delete" {
    mutate {
      add_field => { "elastic_action" => "delete" }
    }
  } else {
    mutate {
      add_field => { "elastic_action" => "index" }
    }
  }
}

output {
  elasticsearch {
    index => "my-index"
    document_id => "%{[@metadata][_id]}"
    action => "%{elastic_action}"
  }
}
I understand that the event from Logstash is of the form {doc: {}, doc_as_upsert: {}, @timestamp, @version}. I have configured mappings in my-index according to the field structure of my CouchDB documents, in particular to cater for arrays of objects by setting their type to nested.

When I run Logstash, however, the mappings in my-index seem to be overwritten, and the "type": "nested" setting is completely removed from the fields I want it applied to. As a result I cannot query the index the way I would like. Any pointers on what I am doing wrong would be much appreciated.
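For reference, the mapping I set on my-index before starting Logstash looks roughly like the sketch below. The attendees, name, and email fields are only placeholder names standing in for one of the arrays of objects in my actual CouchDB documents, and depending on the Elasticsearch version the request may also need a document type level under "mappings":

PUT my-index
{
  "mappings": {
    "properties": {
      "doc": {
        "properties": {
          "attendees": {
            "type": "nested",
            "properties": {
              "name":  { "type": "keyword" },
              "email": { "type": "keyword" }
            }
          }
        }
      }
    }
  }
}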
