Postgres + Logstash: parsing a JSON column with the json filter (and a Ruby alternative)

I spent a lot of time debugging this simple scenario, so here it is for others to use. I did try to find an answer here first, but everything I found was either partial or not working, apart from a ruby filter plugin implementation, which is overkill.

This setup imports the rows and keeps them up to date based on the "ru" field, which stands for "record updated" and holds the precise time of the last update.

Gotcha: the JSON you send to Elasticsearch CAN'T contain a field named _id, it is reserved! That is why the SELECT below aliases _id as id.
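For context, a minimal table shape that matches this config could look like the sketch below (hypothetical, assuming a jsonb column; your actual producers table may differ):

CREATE TABLE producers (
 _id     text PRIMARY KEY, -- source record id, aliased to "id" in the SELECT below
 ru      timestamp,        -- "record updated": precise time of the last update
 rsearch text,             -- regular column, indexed as-is
 rjson   jsonb             -- JSON payload, cast to text and parsed by Logstash
);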

FULLY WORKING CONFIG

input {
 jdbc {
  jdbc_driver_library => "/postgresql-42.2.6.jar"
  jdbc_driver_class => "org.postgresql.Driver"
  jdbc_connection_string => "jdbc:postgresql://localhost:32776/customdbs"
  jdbc_user => "postgres"
  jdbc_password => "root"
  schedule => "* * * * *"
  statement => "SELECT _id as id, ru, rsearch, rjson::text as rjsontext FROM producers WHERE _id = '5b8ab4e0c3088f8de7a3634d' AND ru > :sql_last_value;"
  use_column_value => true
  tracking_column => "ru"
  tracking_column_type => "timestamp"
  record_last_run => true
  last_run_metadata_path => "record.last"
 }
}
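A note on the tracking settings (standard jdbc input behaviour, not specific to this config): on the very first run there is no saved state yet, so for a timestamp tracking column :sql_last_value defaults to 1970-01-01 00:00:00 and the statement effectively runs roughly as:

SELECT _id as id, ru, rsearch, rjson::text as rjsontext
FROM producers
WHERE _id = '5b8ab4e0c3088f8de7a3634d' AND ru > '1970-01-01 00:00:00';

After each run the newest ru value is written to record.last and substituted into :sql_last_value on the next schedule tick, so only rows updated since the previous run are re-imported.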

# WORKING CODE BUT OVERKILL
# filter {
#  ruby {
#   code => "
#    require 'json'
#    rjson = JSON.parse(event.get('rjsontext').to_s)
#    event.set('rjson',rjson)
#   "
#  }
# }

# NATIVE IMPLEMENTATION - WORKING
filter {
 json {
  source => "rjsontext"
  target => "rjson"
  remove_field => ["rjsontext"]
 }
}
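To illustrate what the filter does, suppose a row comes back with rjsontext = '{"name": "Acme", "active": true}' (values invented). After the json filter the event carries a nested object instead of a string, roughly like this in rubydebug output:

 "id"    => "5b8ab4e0c3088f8de7a3634d",
 "rjson" => {
  "name"   => "Acme",
  "active" => true
 }

The rjsontext string itself is dropped by remove_field, so only the parsed object reaches Elasticsearch. Compared to the ruby filter above, this is built in, needs no per-event Ruby code, and tags events that fail to parse with _jsonparsefailure by default.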

output {
 elasticsearch {
  hosts => "0.0.0.0:50100"
  index => "index1"
  document_id => "%{id}"
  action => "update"
  doc_as_upsert => true
 }
}
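Because document_id => "%{id}" is combined with doc_as_upsert => true, repeated runs update the same Elasticsearch document instead of creating duplicates. While testing, it can also help to temporarily add a stdout output alongside the elasticsearch one so each parsed event is printed to the console (a debugging aid only, not part of the final config):

output {
 # elasticsearch block from above stays as-is
 stdout { codec => rubydebug }
}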

Cheers!
