Logstash: manipulate a JSON object before indexing to ES

A Kafka input plugin receives messages in JSON format, which are parsed by the json filter. One of the root keys, transaction_details, can contain a very large number of fields that we don't want to flatten into discrete fields. I'd like to convert the transaction_details root key from an object to a string that can be mapped as a text field, so it's searchable as text but not selectable as a field in Kibana. We want to keep all the other JSON keys as they are when parsed. I can't seem to find a way to do this, other than some kind of regular expression that seems very complicated and probably above my pay grade.

Sharing any thoughts would be most appreciated.


You could encode that field as JSON again after parsing it. Or maybe mapping it as "flattened" in Elasticsearch would make sense?

ruby {
    init => "require 'json'"
    code => "event.set('transaction_details', JSON.generate(event.get('transaction_details')))"
}
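Outside of Logstash, the core transformation the ruby filter performs can be sketched in plain Ruby. The event hash below is a made-up stand-in for what the json filter produces; field names and values are illustrative only:

```ruby
require 'json'

# Hypothetical parsed event, as the json filter might leave it:
# every root key is a native Ruby object.
event = {
  'timestamp'           => '2023-01-01T00:00:00Z',
  'transaction_details' => { 'field_a' => 1, 'field_b' => 'x' }
}

# Re-serialize only transaction_details back into a JSON string;
# all other keys stay as parsed objects.
event['transaction_details'] = JSON.generate(event['transaction_details'])

puts event['transaction_details']  # now a plain String, not a Hash
```

Once indexed, that string maps naturally to a text field, so its contents are full-text searchable but no longer appear as individual selectable fields in Kibana.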
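If you take the string route, you'd pair it with an explicit mapping so Elasticsearch treats the field as text rather than dynamically mapping it. A sketch (the index name is hypothetical):

```json
PUT /my-index
{
  "mappings": {
    "properties": {
      "transaction_details": { "type": "text" }
    }
  }
}
```

Alternatively, skip the ruby filter entirely and map the field as "flattened", which keeps the object structure but indexes all leaf values under a single field.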

Can you paste some sample data from the Kafka input?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.