Input / filter plugin for Google BigQuery

I am running a Logstash instance that is supposed to receive data from Google BigQuery. The issue I am facing is that the rows arrive separately from their schema, so I need to map the incoming values onto the schema and combine them into a single document. Is there an input or maybe a filter plugin that helps me do that?
Example data:

    {
      "f": [
        {
          "v": {
            "f": [
              {
                "v": "1556330454800"
              },
              {
                "v": "15452"
              },
              {
                "v": "true"
              }
            ]
          }
        }
      ]
    }

And the corresponding schema:

    [
      {
        "name": "info",
        "type": "RECORD",
        "mode": "REQUIRED",
        "fields": [
          {
            "name": "stamp_usec",
            "type": "INTEGER",
            "mode": "REQUIRED"
          },
          {
            "name": "time_usec",
            "type": "INTEGER",
            "mode": "NULLABLE"
          },
          {
            "name": "status",
            "type": "BOOLEAN",
            "mode": "REQUIRED"
          }
        ]
      }
    ]

I would like the final document to pair each field name from the schema with its corresponding value in a new JSON object.
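
For the example above, the document I am after would look something like this (BigQuery returns every leaf value as a string, so casting to the schema types would be a separate step):

    {
      "info": {
        "stamp_usec": "1556330454800",
        "time_usec": "15452",
        "status": "true"
      }
    }

In case there is no ready-made plugin, this is a minimal sketch of the transformation I have in mind, written as a ruby filter. The hard-coded `schema` and the `map_row` lambda are illustrations only, not an existing API; a real pipeline would have to load the schema from a file, another event field, or elsewhere:

    filter {
      ruby {
        code => '
          # Illustration only: the schema is hard-coded to match the example;
          # a real pipeline would load it from a file or another event field.
          schema = [
            { "name" => "info", "type" => "RECORD", "fields" => [
              { "name" => "stamp_usec", "type" => "INTEGER" },
              { "name" => "time_usec",  "type" => "INTEGER" },
              { "name" => "status",     "type" => "BOOLEAN" }
            ] }
          ]

          # Recursively zip BigQuery f/v cells with the schema fields.
          map_row = lambda do |fields, cells|
            result = {}
            fields.each_with_index do |field, i|
              value = cells[i]["v"]
              result[field["name"]] =
                if field["type"] == "RECORD"
                  map_row.call(field["fields"], value["f"])
                else
                  value # kept as a string; casting by "type" could be added here
                end
            end
            result
          end

          # Promote the mapped fields to the top level and drop the raw row.
          map_row.call(schema, event.get("f")).each do |name, value|
            event.set(name, value)
          end
          event.remove("f")
        '
      }
    }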
