Elasticsearch output Document missing exception 404

Hi everyone! I am facing a problem with Logstash and the elasticsearch output plugin. Basically I want to index CouchDB changes, using this config:

[CODE]
input {
  couchdb_changes {
    db => "database"
    host => "localhost"
    port => 5984
    username => "username"
    password => "password"
    #initial_sequence => 0 # only required for an initial indexing
  }
}

filter {
  mutate {
    add_field => { "action" => "%{[@metadata][action]}" }
  }
  if [action] == 'delete' {
    elasticsearch {
      hosts => ["localhost:9200"]
      query => "_id:%{[@metadata][_id]}"
      fields => ["type", "$doctype"]
      sort => ""
    }
  } else {
    mutate {
      add_field => { "type" => "%{[doc][$doctype]}" } # yes, my docs have a $doctype field to store the type
    }
  }
}

output {
  elasticsearch {
    action => "%{[@metadata][action]}"
    doc_as_upsert => true
    document_id => "%{[@metadata][_id]}"
    #document_type => "%{[@metadata][$doctype]}" # it won't work, why?
    hosts => ["localhost:9200"]
    index => "my_index"
  }
  stdout { codec => rubydebug } # enable this option for debugging purposes
}

[/CODE]
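In case it helps, this is roughly what one of my CouchDB documents looks like (the values below are made up and most fields are omitted; only the $doctype field matters for the config above):

[CODE]
{
  "_id": "7772e161f5b0e2b1abced1679f00045c",
  "_rev": "1-abc123",
  "$doctype": "kmFile",
  "path": "/files/example.pdf",
  "keywords": ["example"]
}
[/CODE]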

Trying to create or update something in CouchDB always gives me an "update" action, and the final response from Logstash is:

Failed action. {:status=>404, :action=>["update".... ....=>404, "error"=>{"type"=>"document_missing_exception", "reason"=>"[my_document_type][my_document_id]: document missing", "shard"=>"-1", "index"=>"my_index"}}}, :level=>:warn}
It seems that it's not able to create the document when the change comes from a CouchDB update for a document that has not previously been stored in the Elasticsearch index. If I manually index the document first, it works. I don't know whether it is normal for couchdb_changes to emit an "update" action even when it is creating data. Manually configuring action => "create" in the elasticsearch output throws a bunch of errors, starting with "not been able to reach elasticsearch at localhost:9200".
I am using the latest versions, Elasticsearch 2.2.0 and Logstash 2.2.2, but I faced the same problem with previous versions.

I know I'm missing something (too many things, indeed). I only want to do complex searches over CouchDB data through Elasticsearch. I don't know whether I need the full ELK stack or whether Logstash and Elasticsearch alone are sufficient. Thanks in advance!

This could be something else. The couchdb_changes plugin is supposed to set doc_as_upsert, so that even though all actions are updates, Elasticsearch should interpret an update on a non-existing doc as an insert (hence, "upsert").
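You can check that behaviour directly against Elasticsearch, outside Logstash. Something along these lines (the index, type, id and field names are placeholders) should create the document when it does not exist yet, and update it when it does:

[CODE]
# Update-or-insert a single document through the _update API (Elasticsearch 2.x)
curl -XPOST 'http://localhost:9200/my_index/my_type/my_id/_update' -d '{
  "doc": { "some_field": "some_value" },
  "doc_as_upsert": true
}'
[/CODE]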

Thanks for replying. I assume that with a default initial installation of all the products involved (CouchDB, Elasticsearch, Logstash), without any modified configuration, it should work. After creating/updating a CouchDB document, the whole response from Logstash is:

Failed action. {:status=>404, :action=>["update", {:_id=>"7772e161f5b0e2b1abced1679f00045c", :_index=>"kmfile", :_type=>"kmFile", :_routing=>nil}, #<LogStash::Event:0xd847ac @metadata_accessors=#<LogStash::Util::Accessors:0x6a28cc @store={"_id"=>"7772e161f5b0e2b1abced1679f00045c", "action"=>"update", "seq"=>106}, @lut={"[action]"=>[{"_id"=>"7772e161f5b0e2b1abced1679f00045c", "action"=>"update", "seq"=>106}, "action"], "[_id]"=>[{"_id"=>"7772e161f5b0e2b1abced1679f00045c", "action"=>"update", "seq"=>106}, "_id"]}>, @cancelled=false, @data={"doc"=>{"$doctype"=>"kmFile"}, "doc_as_upsert"=>true, "@version"=>"1", "@timestamp"=>"2016-02-26T15:37:06.311Z", "action"=>"update", "type"=>"kmFile"}, @metadata={"_id"=>"7772e161f5b0e2b1abced1679f00045c", "action"=>"update", "seq"=>106}, @accessors=#<LogStash::Util::Accessors:0x291625 @store={"doc"=>{"$doctype"=>"kmFile"}, "doc_as_upsert"=>true, "@version"=>"1", "@timestamp"=>"2016-02-26T15:37:06.311Z", "action"=>"update", "type"=>"kmFile"}, @lut={"action"=>[{"doc"=>{"$doctype"=>"kmFile"}, "doc_as_upsert"=>true, "@version"=>"1", "@timestamp"=>"2016-02-26T15:37:06.311Z", "action"=>"update", "type"=>"kmFile"}, "action"], "[action]"=>[{"doc"=>{"$doctype"=>"kmFile"}, "doc_as_upsert"=>true, "@version"=>"1", "@timestamp"=>"2016-02-26T15:37:06.311Z", "action"=>"update", "type"=>"kmFile"}, "action"], "[doc][$doctype]"=>[{"$doctype"=>"kmFile"}, "$doctype"], "type"=>[{"doc"=>{"$doctype"=>"kmFile"}, "doc_as_upsert"=>true, "@version"=>"1", "@timestamp"=>"2016-02-26T15:37:06.311Z", "action"=>"update", "type"=>"kmFile"}, "type"]}>>], :response=>{"update"=>{"_index"=>"kmfile", "_type"=>"kmFile", "_id"=>"7772e161f5b0e2b1abced1679f00045c", "status"=>404, "error"=>{"type"=>"document_missing_exception", "reason"=>"[kmFile][7772e161f5b0e2b1abced1679f00045c]: document missing", "shard"=>"-1", "index"=>"kmfile"}}}, :level=>:warn}
(kmFile is my document type, stored in the $doctype field.) The Elasticsearch log does not show any error.
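For reference, this is how the document can be looked up directly, using the id from the log above:

[CODE]
curl -XGET 'http://localhost:9200/kmfile/kmFile/7772e161f5b0e2b1abced1679f00045c?pretty'
[/CODE]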

A simple Logstash configuration with the elasticsearch output works fine:

[CODE]
input { stdin { } }

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
[/CODE]
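That test was just started from the command line and fed a sample Apache combined log line on stdin (the config file name below is only an example):

[CODE]
bin/logstash -f simple-apache.conf
# then paste a line such as:
# 127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://www.example.com/" "Mozilla/4.08 [en] (Win98; I ;Nav)"
[/CODE]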

I'm puzzled by $doctype. Is that the only thing you're trying to upload to Elasticsearch? Is it a variable?

So Elasticsearch is complaining because no document with _id 7772e161f5b0e2b1abced1679f00045c exists. This implies that doc_as_upsert isn't working properly. Please find the doc_as_upsert issues at GitHub - logstash-plugins/logstash-output-elasticsearch and add these details there. It may be closed, but if so, I think it may have been closed in error.
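Before opening an issue, it may also be worth replaying the failing update by hand through the bulk API, which is roughly the kind of request the output plugin sends per event (the body below is a hand-written approximation based on your log, not a capture). If this also returns document_missing_exception, the problem is on the Elasticsearch side; if it upserts cleanly, the plugin is the place to look:

[CODE]
curl -XPOST 'http://localhost:9200/_bulk' --data-binary '{"update":{"_index":"kmfile","_type":"kmFile","_id":"7772e161f5b0e2b1abced1679f00045c"}}
{"doc":{"$doctype":"kmFile"},"doc_as_upsert":true}
'
[/CODE]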

Thanks. $doctype with the '$' was the first document type field name I used when I created the database, and I have not changed it since. I have tried changing it to "type" or other names without the "$", but no luck. I am not trying to upload only that field; that was just for testing, and what I am actually uploading has more data. It is for a document knowledge management server application that uses CouchDB to store file paths, keywords, categories and a "metadata" array for those files. I need to do complex searches within that "metadata" array field.

I will keep looking into doc_as_upsert problems. Thank you.

Is it possible that you didn't create the index before you tried to insert the documents into Elasticsearch? Consider the following:

[CODE]
#!/bin/bash
# Example shows how to add a new index

curl -u es_admin \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -XPUT 'http://localhost:9200/kmfile/' -d '{
    "settings" : {
        "index" : {
            "number_of_shards" : 4,
            "number_of_replicas" : 2
        }
    }
}'
[/CODE]

In this case you would need to add all possible indexes to Elasticsearch in advance of starting Logstash.
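Either way, a quick way to see which indexes already exist (and whether kmfile is among them) is:

[CODE]
curl 'http://localhost:9200/_cat/indices?v'
[/CODE]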