Delete documents in Elasticsearch using Logstash

Hello there,

I am using Elasticsearch 6.4.0 on a single-node Windows machine. I am trying to delete documents from Elasticsearch using Logstash.

delete.csv:

id
20
21

Logstash configuration file:

input {
  file {
    path => "C:\ELK_Data\delete.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    columns => ["id"]
  }
}
output {
  elasticsearch {
    action => "delete"
    hosts => "localhost:9200"
    protocol => "transport"
    index => "total_expenses_transaction"
    # "response" is the document type I used when importing the data from MySQL into Elasticsearch with Logstash
    document_type => "response"
    document_id => "%{id}"
  }
}

I am getting this error:

Starting Logstash {"logstash.version"=>"6.4.0"}
[2018-12-03T16:58:21,747][ERROR][logstash.outputs.elasticsearch] Unknown setting 'protocol' for elasticsearch
[2018-12-03T16:58:21,763][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["C:/ELK/logstash/logstash-core/lib/logstash/config/mixin.rb:86:in `config_init'", "C:/ELK/logstash/logstash-core/lib/logstash/outputs/base.rb:60:in `initialize'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:224:in `initialize'", "org/logstash/config/ir/compiler/OutputDelegatorExt.java:48:in `initialize'", "org/logstash/config/ir/compiler/OutputDelegatorExt.java:30:in `initialize'", "org/logstash/plugins/PluginFactoryExt.java:217:in `plugin'", "org/logstash/plugins/PluginFactoryExt.java:166:in `plugin'", "C:/ELK/logstash/logstash-core/lib/logstash/pipeline.rb:71:in `plugin'", "(eval):35:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "C:/ELK/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "C:/ELK/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "C:/ELK/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "C:/ELK/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}

Please suggest.
Thanks.

The elasticsearch output plugin doesn't support a "protocol" parameter: https://www.elastic.co/guide/en/logstash/6.4/plugins-outputs-elasticsearch.html

Just remove this line from your Logstash configuration:
protocol => "transport"
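
With that line removed, the output section should look something like this (keeping the rest of your settings as they are):

output {
  elasticsearch {
    action => "delete"
    hosts => "localhost:9200"
    index => "total_expenses_transaction"
    document_type => "response"
    document_id => "%{id}"
  }
}

The delete action will then remove, for each event, the document whose _id matches the event's id field.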

Sir, thanks for the reply.

I have created a dev folder on the C: drive and commented out the protocol setting:
#protocol => "transport"
After running Logstash, the specified ids (the ones listed in delete.csv) are not removed from Elasticsearch.
Logstash is not terminating after the run.
The Logstash log shows:

[2018-12-04T10:37:32,736][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-12-04T10:37:33,313][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.0"}
[2018-12-04T10:37:39,728][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::Elasticsearch action=>"delete", index=>"monthly_sucess_transaction_new6", id=>"87a0cdd7959a7f5e673d0926183e498c3603db59179c5ceae52c02efb4613847", document_id=>"%{id}", hosts=>[//localhost:9200], document_type=>"response", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_a8cb4110-d395-465a-aad4-7a84d9ae8151", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-12-04T10:37:39,759][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-12-04T10:37:40,384][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-12-04T10:37:40,400][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-12-04T10:37:41,595][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-12-04T10:37:41,876][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-12-04T10:37:41,876][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-12-04T10:37:41,907][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["//localhost:9200"]}
[2018-12-04T10:37:41,922][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-12-04T10:37:41,954][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-12-04T10:37:42,455][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x7d364665 sleep>"}
[2018-12-04T10:37:42,502][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-12-04T10:37:42,517][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-12-04T10:37:42,985][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
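
For clarity, the elasticsearch output I am running now is effectively this (values taken from the log line above; only the protocol line is commented out):

output {
  elasticsearch {
    action => "delete"
    hosts => "localhost:9200"
    #protocol => "transport"
    index => "monthly_sucess_transaction_new6"
    document_type => "response"
    document_id => "%{id}"
  }
}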

Please suggest.
Thanks.
