Elasticsearch lookup filter is not working when the field has special characters

Hi Team,

I am trying to compare data in two indices using a common key whose values contain the special character "/".

For example: common_key: pr/01/1235678

When I use the Elasticsearch filter in Logstash to look up events by common_key, I get an Elasticsearch lookup failure. Could you please let me know how I can look up using the common_key field? All values in common_key have the same format as the example above.

My Logstash pipeline:
input {
  elasticsearch {
    cloud_id => "xxxxxxxxxxxxxxxxxxxx"
    proxy => "xxxxxxxxxxxxxxx"
    index => "index1"
    query => '{"query": {"match_phrase": {"name": "amazon"}}}'
    ssl => true
    user => "xxxxxxxxxx"
    password => "xxxxxxxxxxxxx"
  }
}

filter {
  elasticsearch {
    cloud_id => "xxxxxxxxxxxxxxx"
    proxy => "xxxxxxxxxxxxxxxx"
    index => "index2"
    query => "common_key:%{[common_key]}"
    ssl => true
    user => "xxxxxxxxxxxxx"
    password => "xxxxxxxxxxxxxxxxxx"
    fields => {
      "[source]" => "[source]"
      "[price]" => "[price]"
      "[shop]" => "[shop]"
    }
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    cloud_id => "xxxxxxxxxxxx"
    proxy => "xxxxxxxxxxxx"
    index => "index3"
    ssl => true
    user => "xxxxxxxxxxxx"
    password => "xxxxxxxxxxx"
    action => "create"
  }
}

Thank you in advance.

Can someone please help me here?

The / is a reserved character in Lucene query-string syntax (it introduces a regular expression), so you might have to escape it before passing it to the filter. Can you post the complete error message?
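For example, something like this in front of the elasticsearch filter might work (a sketch; the [@metadata][lookup_key] scratch field is a name I made up):

  filter {
    # Copy the key into a scratch field first, in its own mutate block,
    # so the original common_key written to index3 stays untouched
    # (operations inside a single mutate run in a fixed order, with
    # copy applied after gsub).
    mutate {
      copy => { "common_key" => "[@metadata][lookup_key]" }
    }
    # Escape each / for Lucene query-string syntax. With Logstash's
    # default string handling, "\\/" becomes the replacement text \/ .
    mutate {
      gsub => [ "[@metadata][lookup_key]", "/", "\\/" ]
    }
    elasticsearch {
      # ... same connection, index, and fields settings as before ...
      query => "common_key:%{[@metadata][lookup_key]}"
    }
  }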

Hi @Hemanth_Gowda,

I am not getting any error, but after processing I can see the event tagged with "_elasticsearch_lookup_failure".

If I pass the common_key field from the source with the special characters escaped, for example by using gsub, will that solve my problem?

Thank you in advance.
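If the escaping gets fiddly, an alternative that avoids Lucene query-string parsing entirely is the filter's query_template option, which takes a file containing a full query DSL body; inside a JSON value a / needs no escaping at all. A sketch, assuming common_key is indexed with a keyword sub-field (the file name and path are placeholders):

  {
    "size": 1,
    "query": {
      "term": { "common_key.keyword": "%{[common_key]}" }
    }
  }

Save that as e.g. /etc/logstash/lookup_by_common_key.json and point the filter at it instead of query:

  elasticsearch {
    # ... same connection, index, and fields settings as before ...
    query_template => "/etc/logstash/lookup_by_common_key.json"
  }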