Logstash with Elasticsearch: updating a field that contains a JSON object in the script section

Initially, I inserted the following document into Elasticsearch:
{"id":1234,"name":"Siva","City":"Chennai"}

Now I am trying to update that record by adding these additional fields:
{"phoneType":"Personal","PhoneNumber":"9976997620"}

However, I cannot get the JSON value in the Elasticsearch script section, because the field "phone" is treated as a JSON object rather than a String.
Can anyone suggest how to retrieve the JSON field "phone" inside the script section of the elasticsearch output?

input {
  kafka {
    topics => "topic2"
    bootstrap_servers => "localhost:9092"
  }
}
filter {
  json {
    source => "message"
    target => "phone"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "test_index"
    document_type => "test_type1"
    document_id => "1234"
    action => "update"
    scripted_upsert => "true"
    script_lang => "painless"
    script_type => "inline"
    script => "
      if (ctx._source.phoneData == null) { ctx._source.phoneData = [params.event.get('phone')]; }
      else { ctx._source.phoneData.add(params.event.get('phone')); }
    "
  }
  stdout {
    codec => rubydebug
  }
}
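
For reference, a minimal sketch of one workaround I have been considering, assuming the logstash-filter-json_encode plugin is installed (bin/logstash-plugin install logstash-filter-json_encode): it re-serializes the parsed object back into a JSON string before the output, so the script would receive "phone" as a String instead of a Map.

filter {
  json {
    source => "message"
    target => "phone"
  }
  # With no target set, json_encode overwrites "phone" in place with its JSON text
  json_encode {
    source => "phone"
  }
}

With that in place, the existing script's params.event.get('phone') call should return a plain string. If keeping the value as an object is acceptable, that same call already returns it as a Map in Painless, which can also be appended to phoneData directly.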
