Disable Logstash logger for inputs.jdbc

Hi all,

I am using Logstash 7.12.1 to update an Elasticsearch 7.12.1 index from an Oracle database. I have no problem establishing communication with the DB using the jdbc input declaration below:

input {
  jdbc {
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_driver_library => "<PATH_TO_DRIVER>"
    jdbc_connection_string => "jdbc:oracle:thin:@<IP>:<PORT>/<SERVICE>"
    jdbc_user => "<USER>"
    jdbc_password => "<PASSWORD>"
    statement_filepath => "<update_sql_path>"
    schedule => "* * * * *"
    last_run_metadata_path => "<last_run_path>"
  }
}

However, my SQL command is very long. As you can maybe tell from the input, this is a scheduled update job, and the long SQL command gets written to my log every time it runs. This creates extremely large daily logs that are impractical, not only in size but also in readability: more than 90% of the content is the same SQL command repeated over and over again!

I've attempted changing "sql_log_level", but to no avail; the statement is still written, just at a different log level.
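For reference, that attempt was simply setting the option inside the jdbc block above, e.g. (the level here is just an example):

    sql_log_level => "warn"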

Does anyone perhaps have a solution for this problem?

Best,

Chris

What message is logged? Is it "Executing JDBC query"?

Change the log level in Logstash.

Your request is something like this:

curl -XPUT 'localhost:9600/_node/logging?pretty' \
  -H 'Content-Type: application/json' \
  -d '{ "logger.logstash.inputs.jdbc" : "ERROR" }'

This will make the jdbc input log only at the ERROR level.
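You can also check the current logger levels through the same API, e.g.:

curl -XGET 'localhost:9600/_node/logging?pretty'

This lists the available loggers and their active levels.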


After every query, the SQL string is written to the log. It spans around 20 lines and makes finding the more informative parts of the log, such as which records were updated, deleted, or inserted, a bit tedious.

Thank you! This is exactly what I was going for. Unfortunately, I couldn't find this anywhere in the documentation. If I missed some references on controlling logger output or setting custom logging behavior, I would appreciate any links you can add here!

For this method to work, I had to start my Logstash service first, at which point it began writing to my log file; only after changing the log level to "error" did it stop writing my long SQL command. Would it be possible to configure this before starting the service, in some sort of config file?
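Edit: it looks like this can be set persistently in config/log4j2.properties, which Logstash reads at startup. A minimal sketch of what I mean, in standard log4j2 properties syntax (the "jdbc" key is just an arbitrary label I chose; the logger name is the same one used in the API call above):

logger.jdbc.name = logstash.inputs.jdbc
logger.jdbc.level = error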

Ultimately, my goal is to have custom logging that records what Logstash communicates to my ES index, something of the form:
[@timestamp] [LOG_LEVEL] [logstash.outputs.elasticsearch] "Records (RECORD_ID) updated/inserted/deleted: => List of event keys and values"
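As far as I can tell, the bracketed prefix in that format is close to Logstash's default log4j2 pattern, so that part could presumably be shaped via the appender pattern in config/log4j2.properties as well; a sketch of the relevant line, in standard log4j2 syntax:

appender.console.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %m%n

The per-record detail (IDs, keys, values) would still depend on the elasticsearch output actually logging it, which I don't believe stock plugins do at default levels.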
