How to use parameters in a stored script that's executed using the Logstash output plugin

Hi, I have a stored script:

POST _scripts/1_jd_job_domain
{
  "script": {
    "lang": "painless",
    "params": {
      "generic_domain": "generic domain"
    },
    "source": """
      if (<field> != null && <field> != null) {
        Map map_generic_details = new HashMap();
        ArrayList list_domain_list = new ArrayList();
        map_generic_details.put('index type', params.generic_domain);
        ...
      }
    """
  }
}

I call this script from the logstash output plugin. It works fine otherwise, except that it doesn't retrieve the values from the params. What am I doing wrong?
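For context, Elasticsearch normally takes params at invocation time rather than reading them from the stored definition. A minimal sketch of calling the stored script through the Update API (the index name and document id here are hypothetical):

```
POST my-index/_update/1
{
  "script": {
    "id": "1_jd_job_domain",
    "params": {
      "generic_domain": "generic domain"
    }
  }
}
```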

I've also tried defining those params in the logstash output section, like this:
parameters => {"generic_domain" => "generic_domain"}

but then I get the error:

Response code '400' contacting Elasticsearch at URL 'http://localhost:9200/?generic_domain=generic_domain'

I would prefer a solution where I could get the parameter values from the stored script itself, if possible. Thank you.
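If the values really have to live in the stored script itself, one option (a sketch only, not tested against this setup) is to declare them as variables inside the Painless source instead of relying on params:

```
POST _scripts/1_jd_job_domain
{
  "script": {
    "lang": "painless",
    "source": """
      String generic_domain = 'generic domain';
      Map map_generic_details = new HashMap();
      map_generic_details.put('index type', generic_domain);
    """
  }
}
```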

On the logstash output I've tried:

parameters => { "event" => {"taxless_total_price" => "90"}}

But now it shows the error:

Pipeline aborted due to error {:pipeline_id=>"main", :exception=> Illegal character in query at index 54: http://elastic:secret4hrsearch@localhost:9200/?event=["taxless_total_price", "90"]

Could you help please? Thanks.

I think the parameters option should just be flat key/value pairs, i.e. { "key1" => "value1" "key2" => "value2" }. Otherwise the flat_map call will create an array, which then needs to be encoded.
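A minimal sketch of that flat form inside an output block (the host and the key/value names are placeholders):

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    parameters => { "key1" => "value1" "key2" => "value2" }
  }
}
```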

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.