JDBC Plugin - Missing Converter handling for full class name=org.postgresql.util.PGobject, simple name=PGobject

Hello,
I am trying to determine how to get the JDBC plugin to correctly load JSON and JSONB field types from PostgreSQL. I am running Postgres 9.5 and Logstash 6.5. Here is an example to reproduce the issue.

create table if not exists public.test_products
(
    product_id bigserial not null
        constraint test_products_pkey
            primary key,
    product_name text,
    options json not null,
    created timestamp with time zone default now()
);

INSERT INTO public.test_products (product_name, options) VALUES ('caticorn shirt', '{"color": "green", "size": "large"}');

And my Logstash config:

input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/shop1"
    jdbc_user => "user"
    jdbc_password => "*******"
    jdbc_validate_connection => true
    jdbc_driver_library => "/opt/drivers/postgresql-42.2.5.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT product_id, product_name, options FROM public.test_products"
  }
}
output {
  elasticsearch {
    index => "products"
    document_type => "_doc"
    hosts => ["localhost:9200"]
  }
}

Running Logstash with this config generates the following error:

Missing Converter handling for full class name=org.postgresql.util.PGobject, simple name=PGobject

which is due to the inclusion of the "options" JSON column (verified by removing it from the SELECT statement and restarting Logstash successfully).
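For reference, Logstash starts cleanly with just this statement, so the JSON column is definitely the trigger:

statement => "SELECT product_id, product_name FROM public.test_products"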

I have also tried casting the column to TEXT (options::TEXT) and specifying the field as "nested" in an index template, but the index mapping always comes out as "text". I am assuming this is because, after the conversion, the field no longer arrives as valid, parseable JSON.
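Roughly what I tried, in case it helps. The modified statement:

statement => "SELECT product_id, product_name, options::TEXT AS options FROM public.test_products"

And the index template (the template name and exact body here are from memory, so treat this as a sketch rather than exactly what is on my machine):

PUT _template/products_template
{
  "index_patterns": ["products"],
  "mappings": {
    "_doc": {
      "properties": {
        "options": { "type": "nested" }
      }
    }
  }
}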

Has anyone else run into this and found a solution, or is there a suggested methodology for importing JSON and JSONB fields into Elasticsearch using the Logstash JDBC plugin?

-James
