How to transform JDBC input into a specific JSON format


New to Logstash and trying to backfill a new Elasticsearch instance from a MySQL table using the JDBC input plugin. I have the input plugin working, but right now the output is just going to stdout.

I want to transform the input from the JDBC plugin to match a specific JSON format. For example, the query used in my JDBC input plugin will return:

    "status_name" => "SENT",
    "order_id" => 123,
    "addr_line_1" => "123 Fake St"

I want the resulting JSON to look like:
    { "recordHeader": { "status": "SENT", "orderId": 123, "addressLine1": "123 Fake St" } }

This involves renaming the columns returned by the JDBC input plugin and nesting the values under a new field. Is this something I can achieve with the mutate filter's add_field option? Do I have to add a field that matches the new format and then remove the old one?

You could certainly do it with mutate. Personally I might use ruby:

    ruby {
        code => '
            # Map each old column name to its new camelCase key, nest the
            # value under recordHeader, then drop the original field.
            { "status_name" => "status", "order_id" => "orderId", "addr_line_1" => "addressLine1" }.each { |old, new|
                event.set("[recordHeader][#{new}]", event.get(old))
                event.remove(old)
            }
        '
    }
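Since the question asks about mutate: a mutate-only version is also possible, because rename accepts nested field references on the right-hand side, so it both renames the field and moves it under recordHeader in one step. A sketch, using the field names from the question:

    filter {
      mutate {
        rename => {
          "status_name" => "[recordHeader][status]"
          "order_id"    => "[recordHeader][orderId]"
          "addr_line_1" => "[recordHeader][addressLine1]"
        }
      }
    }

No separate add_field/remove_field pair is needed with this approach.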
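Outside of Logstash, the same transformation boils down to reshaping a hash. This is a plain-Ruby sketch using a hash in place of the Logstash event, with the field names taken from the question:

```ruby
# Old column names mapped to the desired camelCase keys.
RENAMES = {
  "status_name" => "status",
  "order_id"    => "orderId",
  "addr_line_1" => "addressLine1",
}.freeze

# Build the { "recordHeader" => { ... } } shape from a flat row hash.
def to_record_header(row)
  { "recordHeader" => RENAMES.map { |old, new| [new, row[old]] }.to_h }
end

row = { "status_name" => "SENT", "order_id" => 123, "addr_line_1" => "123 Fake St" }
puts to_record_header(row)
```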