A Google BigQuery output plugin error

Hello!
The data in the `old` field that I read from Kafka is an object, and I can't find a way to declare it as a record in the schema.

This is my first configuration:

csv_schema => "old:STRING,...."

And here is the first error:

[2022-12-12T19:06:04,154][WARN ][logstash.outputs.googlebigquery][main] Error while inserting {:key=>0, :location=>"old", :message=>"This field: old is not a record.", :reason=>"invalid"}
[2022-12-12T19:06:04,166][INFO ][logstash.outputs.googlebigquery][main] Problem data is being stored in: /opt/module/bqerror/logstash_test_2022_12-1670843164.log

So I changed the type to RECORD:

csv_schema => "old:RECORD,...."

Then it returned this error:


[ERROR][logstash.javapipeline    ][main] Pipeline error {:pipeline_id=>"main", :exception=>java.lang.IllegalArgumentException: The RECORD field must have at least one sub-field, :backtrace=>["com.google.cloud.bigquery.Field$Builder.setType(com/google/cloud/bigquery/Field.java:132)"
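
From the plugin docs it looks like a RECORD type needs its sub-fields declared, which csv_schema cannot express, so I would probably have to switch to json_schema. A rough sketch of what I think that would look like (the sub-field names field_a and field_b are only placeholders, not my real keys):

json_schema => {
  fields => [{
    name => "old"
    type => "RECORD"
    fields => [{
      # placeholder sub-field names; my real keys are not fixed
      name => "field_a"
      type => "STRING"
    }, {
      name => "field_b"
      type => "STRING"
    }]
  }]
}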

But the keys inside `old` are not fixed; there may be one or more of them, and they vary from event to event.
What should I do?
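
One workaround I am considering is to keep old:STRING in csv_schema and serialize the object into a JSON string before the output stage, something like this ruby filter (untested, just my guess):

filter {
  ruby {
    init => "require 'json'"
    # turn the variable-keyed object into a JSON string so it fits a STRING column
    code => "event.set('old', event.get('old').to_json)"
  }
}

Would something like that be reasonable, or is there a proper way to handle a record whose sub-fields are not known in advance?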
