Logstash filter to extract string from JSON in PostgreSQL table


I am using the jdbc input plugin to get data out of PostgreSQL, but one of my columns holds many values like the ones below, with 100 unique field names.

I am looking to extract the strings from the JSON with a Logstash filter.

 id |                columns                 |         timestamp          | query_id | task_id
  1 | {"uid": "112", "name": "redis-server"} | 2018-07-18 18:45:39.045387 |        1 |       2
  2 | {"uid": "0", "name": "celery"}         | 2018-07-18 18:45:39.047671 |        1 |


{"uid": "112", "name": "redis-server"}

The output should look like:

uid: 112
name: redis-server

Thanks in advance

A json filter should handle this.
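A minimal sketch of that filter, assuming the JSON arrives in a field named columns (the optional target shown is an assumption, not from the thread):

filter {
  json {
    source => "columns"
    # optional: put the parsed keys under a subfield instead of the event root
    # target => "columns_parsed"
  }
}

With no target set, the parsed keys (uid, name) are added at the top level of the event.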

Hi @Badger, I tried the below, but it is not working:

filter {
  json {
    source => "columns"
  }
}

I am getting this exception:

[WARN ][logstash.inputs.jdbc ] Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::OrgLogstash::MissingConverterException: Missing Converter handling for full class name=org.postgresql.util.PGobject, simple name=PGobject>}

That's a problem with the input; you don't even have an event at that point. I would ask a new question about Java::OrgLogstash::MissingConverterException.

Hi @Badger, I have this issue when I try to import "columns" from the above table. If I exclude "columns" (the column containing JSON) I don't get any error message and I am able to pull the data.

As I said, I would ask a new question about Java::OrgLogstash::MissingConverterException. You need to get the input working before you worry about the filter.

A PGobject is used to represent an unknown type, so the input does not know what to do with it. It's probably related to the fact that the column contains JSON. Perhaps you need to CAST the column (because you want a string to input to the json filter). I don't know. I don't use the jdbc input.
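A sketch of that cast, assuming the jdbc input's statement option selects the sample table's columns; the table name mytable is a placeholder, not from the thread. Casting the json/jsonb column to text should let the input hand a plain string to the filter:

input {
  jdbc {
    # connection settings (jdbc_connection_string, user, driver, etc.) omitted
    # columns::text avoids the PGobject by returning the JSON as a plain string
    statement => "SELECT id, columns::text AS columns, timestamp, query_id, task_id FROM mytable"
  }
}
filter {
  json {
    source => "columns"
  }
}

The alias (AS columns) keeps the field name the json filter expects.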

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.