We are currently importing data using Logstash. One of the fields ("request") is a JSON document stored as a string. We now need part of that stored JSON exposed as fields in the searchable index. I have updated the Logstash filter as follows:
filter {
  json {
    source => "request"
    target => "[@metadata][request_json]"
  }
  if [@metadata][request_json][merchant] {
    # in the request, pull out the merchant ID and name
    mutate {
      add_field => {
        "merchant_id" => "%{[@metadata][request_json][merchant][id]}"
        "merchant_name" => "%{[@metadata][request_json][merchant][name]}"
      }
    }
  }
}
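For illustration, assuming the stored "request" string contains a merchant object shaped like the one below (a hypothetical example, not real data), the filter parses it into `@metadata` and copies two fields onto the event:

```text
# hypothetical incoming event, "request" is JSON stored as a string:
{ "request": "{\"merchant\":{\"id\":\"M-1001\",\"name\":\"Acme Ltd\"}}" }

# after the json + mutate filters, the indexed event gains:
{ "merchant_id": "M-1001", "merchant_name": "Acme Ltd" }
```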
This works great for new data.
How can I update the indices for the historic data? I'm using Elasticsearch, Logstash, and Kibana 8.5.3.
Thanks for the advice. So, I'm looking at this, but struggling with how to implement the Painless script, since Painless has limited support for JSON. I've started off OK with:
[
  {
    "json": {
      "field": "request",
      "target_field": "request_json"
    }
  },
  {
    "script": {
      "source": "def now_what;",
      "lang": "painless"
    }
  },
  {
    "remove": {
      "field": "request_json"
    }
  }
]
But the bit in the middle is where I'm struggling. I've never used Painless, and there's not much help out there on Google.
Why not use a number of set processors instead of painless?
Because YOU are magic! LOL, thank you so so much!
[
  {
    "json": {
      "field": "request",
      "target_field": "request_json"
    }
  },
  {
    "set": {
      "field": "merchant_name",
      "copy_from": "request_json.merchant.name",
      "ignore_empty_value": true
    }
  },
  {
    "set": {
      "field": "merchant_name",
      "copy_from": "request_json.data.application.merchant.name",
      "override": false,
      "ignore_empty_value": true
    }
  },
  {
    "set": {
      "field": "merchant_id",
      "copy_from": "request_json.merchant.id",
      "ignore_empty_value": true
    }
  },
  {
    "set": {
      "field": "merchant_id",
      "copy_from": "request_json.data.application.merchant.id",
      "override": false,
      "ignore_empty_value": true
    }
  },
  {
    "remove": {
      "field": "request_json"
    }
  }
]
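To actually apply this to the historic data (the original question), one approach is to save the processors as a named ingest pipeline and then run an update-by-query over the existing index. A sketch in Kibana Dev Tools syntax; the pipeline name `extract_merchant` and index name `my-index` are placeholders, substitute your own:

```text
PUT _ingest/pipeline/extract_merchant
{
  "processors": [
    ... the processor array from above ...
  ]
}

POST my-index/_update_by_query?pipeline=extract_merchant
{
  "query": {
    "bool": {
      "must_not": { "exists": { "field": "merchant_id" } }
    }
  }
}
```

The `must_not`/`exists` query limits the update to documents that don't already have the new fields, so a re-run only touches the remaining historic data. If the index is still being written to, adding `conflicts=proceed` to the `_update_by_query` request avoids aborting on version conflicts.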