JDBC Static Filter Plugin - error handling: how to skip enrichment when the database is down

Hi,

We've got Logstash fetching information from a MySQL database for log enrichment via the JDBC Static filter plugin.

The configuration works just fine as long as the database is reachable:

...
   jdbc_static {
      loaders => [
        {
          id => "user-details"
          query => "ENRICHMENT_DATA_FETCH_SQL_QUERY"
          local_table => "enrichmentdata"
        }
      ]
      local_db_objects => [
        {
          name => "enrichmentdata"
          index_columns => ["ip_address"]
          columns => [
            ...
          ]
        }
      ]
      local_lookups => [
        {
          id => "local-enrichmentdata"
          query => "QUERY"
          prepared_parameters => ["[srcip]"]
          target => "enrichmentuser"
        }
      ]
      add_field => { enrichment_name => "%{[enrichmentuser][0][name]}" }
      remove_field => ["enrichmentuser"]
      staging_directory => "/tmp/logstash/jdbc_static/import_data"

      jdbc_user => "USERNAME"
      jdbc_password => "PASSWORD"
      jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
      jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/mysql-connector-j-8.0.32.jar"
      jdbc_connection_string => "jdbc:mysql://MySQL-Server-IP:3306/enrichment"
    }
  }
...

However, Logstash stops working when the MySQL database becomes unreachable:

[ERROR][logstash.javapipeline    ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::Filters::Jdbc::ConnectionJdbcException: Java::ComMysqlCjJdbcExceptions::CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.>, :backtrace=>["com.mysql.cj.jdbc.exceptions.SQLError.createCommunicationsException(com/mysql/cj/jdbc/exceptions/SQLError.java:174)",

How can we handle this error so that Logstash skips the JDBC Static filter and continues sending logs to Elasticsearch when the database is unreachable?

I do not believe that is possible.

So we added a second MySQL DB (a standby) that pulls its content from the main MySQL DB.

The standby DB runs on the same machine as Logstash.

Logstash fetches the enrichment details from the standby DB at all times.

The content of the standby DB is synced with the main DB at regular intervals.

If the main DB becomes unavailable, Logstash keeps working because the local standby is still reachable (see the config sketch below).
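For reference, the only change needed on the Logstash side is to point the connection string at the local standby instead of the remote main DB. This is a minimal sketch, assuming the standby listens on the default MySQL port on the same host as Logstash; the loaders, lookups, and other jdbc_static options stay exactly as in the original config:

   jdbc_static {
      # ... loaders, local_db_objects, local_lookups, add_field, etc. unchanged ...

      jdbc_user => "USERNAME"
      jdbc_password => "PASSWORD"
      jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
      jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/mysql-connector-j-8.0.32.jar"
      # 127.0.0.1:3306 is an assumption - use whatever host/port the local standby listens on
      jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/enrichment"
    }

Because every lookup goes to the local standby, an outage of the main DB only delays the next sync; the enrichment itself keeps running from the last synced copy of the data.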
