Logstash Docker gives "certificate verify failed" with the BigQuery output plugin

Hi,

I am trying to use the Google BigQuery Logstash output in a Docker container. As soon as the pipeline starts, I get the following error:

logstash_1 | [2018-03-06T19:57:58,505][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::OutputDelegator:0x5893c3c1 @namespaced_metric=#<LogStash::Instrument::NamespacedMetric:0x7a8f4807 @metric=#<LogStash::Instrument::Metric:0x772a9c6d @collector=#<LogStash::Instrument::Collector:0x104a53c0 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x568107c3 @store=#<Concurrent::Map:0x00000000000fb4 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x37b21c13>, @fast_lookup=#<Concurrent::Map:0x00000000000fb8 entries=55 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :\"14708533f248ee18126fec935b0d7eb0e94dc0d0a50de18b101efc7435f991df\"]>, @metric=#<LogStash::Instrument::NamespacedMetric:0x4a2d78e5 @metric=#<LogStash::Instrument::Metric:0x772a9c6d @collector=#<LogStash::Instrument::Collector:0x104a53c0 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x568107c3 @store=#<Concurrent::Map:0x00000000000fb4 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x37b21c13>, @fast_lookup=#<Concurrent::Map:0x00000000000fb8 entries=55 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs]>, @out_counter=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @strategy=#<LogStash::OutputDelegatorStrategies::Single:0x3c841c48 @mutex=#<Mutex:0x48f26b93>, @output=<LogStash::Outputs::GoogleBigQuery project_id=>\"arched-photon-194421\", dataset=>\"msgs-2018.02.10\", json_schema=>{\"fields\"=>[{\"name\"=>\"_id\", \"type\"=>\"STRING\"}, {\"name\"=>\"_index\", \"type\"=>\"STRING\"}, {\"name\"=>\"msg.message_text\", \"type\"=>\"STRING\"}]}, key_path=>\"/usr/share/logstash/keys/my-project-fcee9ca06100.p12\", key_password=>\"notasecret\", service_account=>\"elastic-search-data@arched-photon-194421.iam.gserviceaccount.com\", date_pattern=>\"%Y-%m-%dT%H:00\", flush_interval_secs=>2, uploader_interval_secs=>60, deleter_interval_secs=>60, id=>\"14708533f248ee18126fec935b0d7eb0e94dc0d0a50de18b101efc7435f991df\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_e94431cd-7ba4-416e-b363-a10348fc9b33\", enable_metric=>true, charset=>\"UTF-8\">, workers=>1, table_prefix=>\"logstash\", table_separator=>\"_\", ignore_unknown_values=>false, temp_file_prefix=>\"logstash_bq\">>, @in_counter=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @id=\"14708533f248ee18126fec935b0d7eb0e94dc0d0a50de18b101efc7435f991df\", @time_metric=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x6939d838 @metric=#<LogStash::Instrument::Metric:0x772a9c6d @collector=#<LogStash::Instrument::Collector:0x104a53c0 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x568107c3 @store=#<Concurrent::Map:0x00000000000fb4 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x37b21c13>, @fast_lookup=#<Concurrent::Map:0x00000000000fb8 entries=55 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :\"14708533f248ee18126fec935b0d7eb0e94dc0d0a50de18b101efc7435f991df\", :events]>, @output_class=LogStash::Outputs::GoogleBigQuery>", :error=>"certificate verify failed", :thread=>"#<Thread:0x60ea0d79 run>"}
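
For reference, the output section of my pipeline config looks roughly like this (the values are copied from the settings dumped in the error above; only the formatting is reconstructed):

    output {
      google_bigquery {
        project_id => "arched-photon-194421"
        dataset => "msgs-2018.02.10"
        json_schema => {
          "fields" => [{
            "name" => "_id"
            "type" => "STRING"
          }, {
            "name" => "_index"
            "type" => "STRING"
          }, {
            "name" => "msg.message_text"
            "type" => "STRING"
          }]
        }
        key_path => "/usr/share/logstash/keys/my-project-fcee9ca06100.p12"
        key_password => "notasecret"
        service_account => "elastic-search-data@arched-photon-194421.iam.gserviceaccount.com"
        date_pattern => "%Y-%m-%dT%H:00"
        flush_interval_secs => 2
      }
    }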

I think this is a bug in the BigQuery output plugin.

I tried updating the SSL certificates in the Docker image, with no luck.
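
Roughly, that attempt looked like the following Dockerfile layer (a sketch only; I am assuming the CentOS 7 based official image, so the exact package commands may differ):

    FROM docker.elastic.co/logstash/logstash:6.2.2

    # Refresh the operating system CA bundle (this did not fix the error)
    USER root
    RUN yum -y update ca-certificates && update-ca-trust extract
    USER logstash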

Any workaround would be highly appreciated!

The Logstash Docker image version is 6.2.2, and the BigQuery plugin was installed with logstash-plugin install logstash-output-google_bigquery.
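
The plugin is baked into the image at build time, along these lines (the exact Dockerfile is an approximation, but the install command is the one above):

    FROM docker.elastic.co/logstash/logstash:6.2.2

    # The BigQuery output plugin is not bundled with 6.2.2, so install it into the image
    RUN bin/logstash-plugin install logstash-output-google_bigquery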
