Logstash multiple JSON fields

I want to send Postgres data to Elasticsearch using Logstash.

My Postgres table has more than one JSON column; for example, the continent and country columns are both json type.

How can I configure the filter option?

filter {
        json {
              source => "continent"
        }
}

The documentation only explains a single JSON field, but I have many JSON fields.

Can anybody help me configure this? Thank you in advance.

In my table, continent and country are both json fields.

Hi there,

Can you post a sample of what Logstash sees when it queries your Postgres? What does the following pipeline spit out?

input {
  # whatever your input is
}

filter {}

output {
  stdout {}
}

Use a second json filter to parse the second column.


Hey, look, as you can see, I'm a newbie. Without an example, you don't have to answer this.

| region_id | continent | country |
|-----------|-----------|---------|
| 1000 | {"id": "500001", "code": "NA", "name": "북아메리카", "type": "continent", "name_en": "North America"} | {"id": "201", "code": "US", "name": "미국", "type": "country", "code3": "USA", "name_en": "United Stat |

My example data looks like this, and the continent column is json type.

Here is my current .conf file:

input {
      jdbc {
            jdbc_connection_string => "jdbc:postgresql://ip:port/dbname"
            jdbc_user => "user"
            jdbc_password => "password"
            jdbc_driver_library => "/lib/jar_location"
            schedule => "*/15 * * * *"
            statement => "SELECT region_id, continent, country from expedia_region_union order by region_id asc LIMIT 100"
      }
      stdin {
            codec => plain { charset => "UTF-8" }
      }
}

filter {
        json {
              source => "message"
              add_field => { "continent" => "continent" }
        }
        json {
              source => "message"
              add_field => { "country" => "country" }
        }
}

output {
       elasticsearch {
           hosts => ["host"]
           index => "index_name"
           doc_as_upsert => true
           action => "update"
           document_id => "%{region_id}"
       }
       stdout { codec => rubydebug }
}

And this doesn't work. How can I configure it correctly to send the JSON columns?

If both the continent and country fields are JSON then use

    json { source => "continent" }
    json { source => "country" }
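
For completeness, a sketch of a full filter block along those lines. The target settings are an assumption on my part: without a target, the json filter writes the parsed keys to the root of the event, so two columns that share keys such as id, code, and name would overwrite each other.

    filter {
      # Parse each JSON column into a field of the same name,
      # replacing the original string with the parsed object
      json {
        source => "continent"
        target => "continent"
      }
      json {
        source => "country"
        target => "country"
      }
    }

Note that you don't need source => "message" here: with a jdbc input each column already arrives as its own event field, so you point source at the column itself.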

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.