Remove duplicates in a logstash array

I have a JSON string with name/value pairs:

{
  "A1": "123",
  "A2": "225",
  "A3": "668",
  "A4": "225"
}

Using Logstash I can mutate the string and end up with an array by using:

mutate {
  add_field => { "A" => "%{A1}" }
  add_field => { "A" => "%{A2}" }
  add_field => { "A" => "%{A3}" }
  add_field => { "A" => "%{A4}" }
}

mutate {
  remove_field => ["A1", "A2", "A3", "A4"]
}

which gives:

{
  "A": ["123", "225", "668", "225"]
}

I would like to eliminate the duplicates in the array so that I end up with:

{
  "A": ["123", "225", "668"]
}

I have tried using kv {} with allow_duplicate_values => false, but that would only work if my input data were of the format:

{
  "A": "123",
  "A": "225",
  "A": "668",
  "A": "225"
}
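
For reference, here is roughly what I tried (a sketch only, and it assumes the values would arrive as a flat key=value string in the message field rather than as JSON):

kv {
  source => "message"                # assumed input, e.g. "A=123 A=225 A=668 A=225"
  field_split => " "
  value_split => "="
  allow_duplicate_values => false    # drops the repeated A=225 pair
}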

Looking at this post, it would appear that the answer is some variation on:
if [A2] in [A] {
if %{A2} in [A] {

I would appreciate people's help with this. Thanks.

ruby {
  # replace the array with a copy that has duplicate entries removed
  code => 'event.set("A", event.get("A").uniq)'
}
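
A small, untested variant of the same idea: if [A] could ever hold a single value instead of an array (for example when only A1 exists), wrapping it in Ruby's Array() keeps the uniq call from raising an exception:

ruby {
  # assumption: "A" may be either a single string or an array;
  # Array() normalizes both cases before deduplicating
  code => 'event.set("A", Array(event.get("A")).uniq)'
}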

Thanks Jenni.

Even quicker using the built-in plugins:

if [A1] {
  mutate {
    add_field => { "A" => "%{A1}" }
  }
}
if [A2] {
  if !([A]) or !([A2] in [A]) {
    mutate {
      add_field => { "A" => "%{A2}" }
    }
  }
}
if [A3] {
  if !([A]) or !([A3] in [A]) {
    mutate {
      add_field => { "A" => "%{A3}" }
    }
  }
}
if [A4] {
  if !([A]) or !([A4] in [A]) {
    mutate {
      add_field => { "A" => "%{A4}" }
    }
  }
}
