Logstash to delete document from Elasticsearch

I am using Logstash version 1.5.1 to parse a log file and create or update documents in Elasticsearch. In one case, I am trying to delete an existing document from Elasticsearch via the Logstash configuration file, but the following exception is logged repeatedly:

Got error to send bulk of actions: [500] {"error":"IllegalArgumentException[Malformed action/metadata line [2], expected START_OBJECT or END_OBJECT but found [VALUE_STRING]]","status":500} {:level=>:error}
Failed to flush outgoing items {:outgoing_count=>9, :exception=>#<Elasticsearch::Transport::Transport::Errors::InternalServerError: [500] {"error":"IllegalArgumentException[Malformed action/metadata line [2], expected START_OBJECT or END_OBJECT but found [VALUE_STRING]]","status":500}>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/transport/base.rb:135:in `__raise_transport_error'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/transport/base.rb:227:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/transport/http/manticore.rb:54:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/client.rb:119:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.12/lib/elasticsearch/api/actions/bulk.rb:80:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch/protocol.rb:103:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:466:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:490:in `flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:219:in `buffer_flush'", "org/jruby/RubyHash.java:1341:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:216:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:193:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:112:in `buffer_initialize'", "org/jruby/RubyKernel.java:1511:in `loop'", 
"/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:110:in `buffer_initialize'"], :level=>:warn}
^CSIGINT received. Terminating immediately.. {:level=>:fatal}
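For context, the Bulk API body is NDJSON: every action/metadata line must be a JSON object, and a `delete` action must not be followed by a source line. A minimal sketch in plain Ruby (index name, type, and IDs here are hypothetical) of what a well-formed bulk body containing a delete looks like:

```ruby
require 'json'

# Each bulk entry is one action/metadata object per line (NDJSON).
# "index"/"create"/"update" actions are followed by a source line;
# "delete" actions are not.
bulk_body = [
  { index:  { _index: 'syndromic-2015.07', _type: 'DGL', _id: 'msg-1' } }.to_json,
  { field: 'value' }.to_json,  # source document for the index action above
  { delete: { _index: 'syndromic-2015.07', _type: 'DGL', _id: 'msg-2' } }.to_json
  # note: no source line follows the delete action
].join("\n") + "\n"

# Every line must parse as a JSON object (START_OBJECT); anything else on an
# action/metadata line is what triggers a "Malformed action/metadata line" error.
bulk_body.each_line do |line|
  raise 'malformed bulk line' unless JSON.parse(line).is_a?(Hash)
end
```

If anything other than a JSON object lands where ES expects an action/metadata line (for example a stray source line after a delete), the bulk request fails with exactly the kind of 500 shown above.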

Here is my logstash output configuration:

output {
        if !("_grokparsefailure" in [tags]) {
                if [type] == "DGL" {
                        if [facility_id] != "MDN" {
                                elasticsearch {
                                        document_id => "%{messageID}"
                                        template_overwrite => true
                                        template_name => "syndromic"
                                        template => "/opt/logstash/template.json"
                                        protocol => "http"
                                        codec => "plain"
                                        manage_template => true
                                        host => "orion-mao-devehsreporting1v"
                                        index => "syndromic-%{+YYYY.MM}"
                                }
                        } else {
                                elasticsearch {
                                        document_id => "%{messageID}"
                                        protocol => "http"
                                        host => "orion-mao-devehsreporting1v"
                                        index => "syndromic-%{+YYYY.MM}"
                                        action => "delete"
                                }
                        }
                }
                if [type] == "CAL" {
                        elasticsearch {
                                template_overwrite => true
                                template_name => "syndromic"
                                template => "/opt/logstash/template.json"
                                protocol => "http"
                                codec => "plain"
                                manage_template => true
                                host => "orion-mao-devehsreporting1v"
                                index => "syndromic-%{+YYYY.MM}"
                        }
                }

        }
        else {
                stdout { codec => rubydebug }
        }
}

The part of the configuration that fails is the block with action => "delete".
Please help me out here.

Thanks.

Is there anything corresponding in the ES logs?

No, there is no activity in the ES logs. I consistently see this error whenever I delete the .sincedb files and run Logstash again to re-parse the files:

Got error to send bulk of actions: [500] {"error":"IllegalArgumentException[Malformed action/metadata line [2], expected START_OBJECT or END_OBJECT but found [VALUE_STRING]]","status":500} {:level=>:error}
Failed to flush outgoing items {:outgoing_count=>81, :exception=>"Elasticsearch::Transport::Transport::Errors::InternalServerError", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/transport/base.rb:135:in `__raise_transport_error'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/transport/base.rb:227:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/transport/http/manticore.rb:54:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.12/lib/elasticsearch/transport/client.rb:119:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.12/lib/elasticsearch/api/actions/bulk.rb:80:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.0.7-java/lib/logstash/outputs/elasticsearch/protocol.rb:104:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.0.7-java/lib/logstash/outputs/elasticsearch.rb:542:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.0.7-java/lib/logstash/outputs/elasticsearch.rb:566:in `flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:219:in `buffer_flush'", "org/jruby/RubyHash.java:1341:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:216:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:193:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:112:in `buffer_initialize'", "org/jruby/RubyKernel.java:1511:in `loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:110:in `buffer_initialize'"], :level=>:warn}

This started happening when I added "delete" to my configuration. Please let me know the best way to delete a record from ES using the Logstash config.

Thanks.

Please read this, it seems to be related: CouchDB plugin set dynamic type for Elasticsearch

Thank you for your reply. I am using a similar configuration, and it still throws this exception constantly until I kill Logstash.

Is this a bug in Logstash?

Notice this:
expected START_OBJECT or END_OBJECT
But I am not sending any object, and there is no option in the Elasticsearch output to send the source as null.

Please help me here.
Thanks,
Umang

The problem is actually caused by the logstash-output-elasticsearch plugin. In 'http' mode it sends the document source along with the 'delete' command, which ES does not expect. You will have to wait until https://github.com/logstash-plugins/logstash-output-elasticsearch/issues/195 is resolved, or patch the protocol.rb file yourself as described in https://github.com/logstash-plugins/logstash-output-elasticsearch/issues/195#issuecomment-119745469
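For anyone hitting this before the fix lands: the workaround in the linked issue amounts to not emitting a source line when the bulk action is a delete. A hedged sketch of that serialization logic in plain Ruby (illustrative only, not the plugin's actual protocol.rb code; the example index names and documents are made up):

```ruby
require 'json'

# Illustrative bulk serializer: emits one action/metadata line per entry,
# plus a source line for everything EXCEPT delete actions.
def serialize_bulk(actions)
  actions.flat_map { |action, metadata, source|
    lines = [{ action => metadata }.to_json]
    lines << source.to_json unless action == 'delete'  # deletes carry no source
    lines
  }.join("\n") + "\n"
end

body = serialize_bulk([
  ['index',  { _index: 'syndromic-2015.07', _id: 'a' }, { msg: 'hi' }],
  ['delete', { _index: 'syndromic-2015.07', _id: 'b' }, { msg: 'bye' }]
])
# body has 3 lines: the index action, its source, and the delete action alone
```

The point of the patch is exactly this `unless action == 'delete'` guard: without it, the event source is appended after the delete action, and ES misreads it as the next action/metadata line.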

OK, great, that looks promising. I'll try it and post the results here.

Thanks again.