Upsert with Logstash

Hi,
I am trying to upsert a value into an array field of an existing document. I have the config file below, but Logstash throws the error shown after it. Can anyone please help?

File:
input {
  file {
    path => ["C:\ElasticSearch\logstash-2.3.4\bin\testspecialty.csv"]
    type => "csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["Provider_Id","Secondary_Specialty"]
    separator => ","
  }
}

output {
  elasticsearch {
    action => "update"
    hosts => "localhost:9200"
    index => "testsearchresults"
    document_id => "%{Provider_Id}"
    document_type => "testresults"
    upsert => {
      "document_id" : "%{provider_id}",
      "Secondary_Specialty_String" : ["%{Secondary_Specialty}"]
    }
    manage_template => true
  }

  stdout { codec => "rubydebug" }
}
Error:
fetched an invalid config {:config=>"\ninput { \n file {\n path => ["C:\ElasticSearch\logstash-2.3.4\bin\testspecialty.csv"]\n type => "csv"\n start_position => "beginning"\n }\n}\n\nfilter { \n csv {\n columns => ["Provider_Id","Secondary_Specialty"]\n separator => ","\n }\n}\noutput {\nelasticsearch {\n action => "update"\n hosts => "localhost:9200"\n index => "testsearchresults"\n document_id => "%{Provider_Id}"\n document_type => "testresults"\n upsert => {\n "document_id" : "%{provider_id}",\n "Secondary_Specialty_String" :["%{Secondary_Specialty}"]\n }\n manage_template => true\n }\n \nstdout { }\n}\n", :reason=>"Expected one of #, => at line 24, column 19 (byte 485) after output {\nelasticsearch {\n action => "update"\n hosts => "localhost:9200"\n index => "testsearchresults"\n document_id => "%{Provider_Id}"\n document_type => "testresults"\n upsert => {\n "document_id" ", :level=>:error, :file=>"/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/agent.rb", :line=>"430", :method=>"create_pipeline"}
starting agent {:level=>:info, :file=>"/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/agent.rb", :line=>"207", :method=>"execute"}
The signal HUP is in use by the JVM and will not work correctly on this platform

According to the docs the upsert option is supposed to be a string, and even if hashes were accepted they would have to look like this:

upsert => {
  "document_id" => "%{provider_id}"
  "Secondary_Specialty_String" => ["%{Secondary_Specialty}"]
}
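
If you use the documented string form instead, the value is a JSON string, something along these lines (just a sketch; I haven't checked how %{} references inside the string are expanded in your plugin version):

upsert => '{ "Secondary_Specialty_String": ["%{Secondary_Specialty}"] }'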

Hi Magnus,

I made the changes you suggested and tried the config file again, but it gave me the error below. I searched for a fix but couldn't find one. Can you please help?

Registering file input {:path=>["C:\ElasticSearch\logstash-2.3.4\bin\testspecialty.csv"], :level=>:info, :file=>"/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-input-file-2.2.5/lib/logstash/inputs/file.rb", :line=>"171", :method=>"register"}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"C:\Users\vamsi/.sincedb_469ac9bb1d0de3017c85ea3629a99535", :path=>["C:\ElasticSearch\logstash-2.3.4\bin\testspecialty.csv"], :level=>:info, :file=>"/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-input-file-2.2.5/lib/logstash/inputs/file.rb", :line=>"216", :method=>"register"}
Invalid setting for elasticsearch output plugin:

output {
elasticsearch {
# This setting must be a string
# Expected string, got {"document_id"=>"%{provider_id}", "Secondary_Specialty_String"=>["%{Secondary_Specialty}"]}
upsert => {"document_id"=>"%{provider_id}", "Secondary_Specialty_String"=>["%{Secondary_Specialty}"]}
...
}
} {:level=>:error, :file=>"/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/config/mixin.rb", :line=>"374", :method=>"validate_check_parameter_values"}


Hi,

I am also facing the same issue with upsert.
Can you please tell me how you resolved it?

Thanks
Pramod

Hello,
The only way I found to fill in the upsert option of the elasticsearch output is to use the json_encode filter. It produces a string that I then pass to upsert.
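
Roughly like this (a sketch only; it assumes the json_encode filter plugin is installed, and the upsert_doc / upsert_json field names are just examples I made up):

filter {
  # Collect the fields that should make up the upsert document.
  mutate {
    add_field => {
      "[upsert_doc][Provider_Id]" => "%{Provider_Id}"
      "[upsert_doc][Secondary_Specialty_String]" => "%{Secondary_Specialty}"
    }
  }
  # Serialize that subdocument to a JSON string, since the upsert option expects a string.
  json_encode {
    source => "upsert_doc"
    target => "upsert_json"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "testsearchresults"
    document_id => "%{Provider_Id}"
    action => "update"
    upsert => "%{upsert_json}"
  }
}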

Hope it can help someone...

Why don't you try doc_as_upsert instead? With action => "update" and doc_as_upsert => true, the event itself is used as the upsert document, so you don't have to build an upsert string at all:

elasticsearch {
  hosts => ["localhost:9200"]
  index => "testsearchresults"
  document_id => "%{Provider_Id}"
  doc_as_upsert => true
  action => "update"
  manage_template => true
}