Read column of datatype "JSON" from database using Logstash

I have developed a script to read a table from a Postgres database and load the data in JSON format into a Kafka topic. The problem is that one of the columns in the table (cust_record) is of JSON type, and I could not read that column as JSON using Logstash. So I converted the JSON type to text and then loaded it into Kafka.
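Roughly, the output side of my pipeline looks like this (the broker address and the topic name cust_topic are just placeholders I've put in for illustration):

```
output {
  kafka {
    # Placeholder broker address and topic name.
    # With the json codec the whole event is serialized, so the text version
    # of cust_record ends up as an escaped JSON string inside the message.
    bootstrap_servers => "localhost:9092"
    topic_id => "cust_topic"
    codec => json
  }
}
```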

Now I need to extract the value of one of the keys in that JSON field. Since I converted it to text, I am not sure how to parse that text as JSON inside the filter plugin.
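This is roughly what I had in mind for the filter block, assuming the field coming out of Kafka is still named cust_record (the target name parsed_record is just something I made up), but I don't know if it is the right approach:

```
filter {
  # Try to parse the stringified JSON in cust_record back into structured fields.
  # "cust_record" is the column name from my table; "parsed_record" is a made-up target.
  json {
    source => "cust_record"
    target => "parsed_record"
  }
}
```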

Note: If I try to read the JSON column from the table as it is, I get this error: exception => Java::OrgLogstash::MissingConverterException: Missing Converter handling for full class name=org.postgresql.util.PGobject
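One workaround I was considering (I am not sure whether it is the recommended way) is to cast the column to text directly in the jdbc input's SQL statement, so the PGobject never reaches Logstash. The table name customers and the connection settings below are placeholders:

```
input {
  jdbc {
    # Connection settings are placeholders for illustration.
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "me"
    jdbc_driver_class => "org.postgresql.Driver"
    # Casting cust_record to text keeps Logstash from receiving a PGobject.
    statement => "SELECT id, cust_record::text AS cust_record FROM customers"
  }
}
```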

In the Kafka topic, the column (cust_record) looks like this:

"cust_record" : "{nested_json}"

How can I extract the values of the JSON keys when that JSON has been converted to text while reading from the source table? Or,
is there any way I can read the column as type "JSON" itself, without converting it to text, so that I can extract the values of the JSON keys inside the filter plugin?

Kindly help. I am a newbie with this technology and appreciate your assistance with this problem.
