Given a block of JSON like:
{"":"value1","field2":"value2"}
Where the first key has a blank name ("") but an otherwise ordinary value of value1, you might want to handle such data before indexing into Elasticsearch, where an empty field name will fail to parse:
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"field name cannot be an empty string"}}}}}
With a test Logstash pipeline using an http input:
input {
  http {
    port => 8080
  }
}
and a test curl command to send in the sample JSON:
curl --header "Content-Type: application/json" \
  --request POST \
  --data '{"":"value1","field2":"value2"}' \
  http://localhost:8080
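To see what each event looks like as it passes through the pipeline, you can also add an output section, for example a stdout output with the rubydebug codec (a minimal sketch just for testing; any output plugin would do):
output {
  stdout {
    # print each event as a readable hash for inspection
    codec => rubydebug
  }
}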
If you just want to remove a key with an empty name at the top level of the JSON, you can remove the field by its name, which is "", using a Ruby filter. Removing a field that does not exist is a no-op, so no conditional guard is needed:
ruby {
  code => '
    event.remove("")
  '
}
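In the pipeline configuration this goes inside the standard filter section, between the input and any output, for example:
filter {
  ruby {
    code => '
      # drop the top-level field whose name is the empty string
      event.remove("")
    '
  }
}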
If you want to keep the blank key's value rather than discard it, you can copy it to a new key name such as blank_key and then remove the original:
ruby {
  code => '
    # fetch the value stored under the empty field name, if any
    testevent = event.get("")
    if testevent
      event.remove("")
      event.set("blank_key", testevent)
    end
  '
}
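With that filter in place, re-sending the same curl request produces an event whose user-supplied fields are (leaving aside the metadata fields Logstash adds to every event, such as @timestamp):
{"blank_key":"value1","field2":"value2"}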