Logstash does not start when the jdbc_static filter fails to connect


I've created a logstash filter using jdbc_static:

    filter {
      jdbc_static {
        loaders => [
          {
            id => "remote-ipsservers"
            query => "select address, display_name from host_list"
            local_table => "host_list"
          }
        ]
        local_db_objects => [
          {
            name => "host_list"
            index_columns => ["address"]
            columns => [
              ["address", "varchar(255)"],
              ["display_name", "varchar(255)"]
            ]
          }
        ]
        local_lookups => [
          {
            id => "local-ipsservers"
            query => "select display_name as description from host_list WHERE address = :ip"
            parameters => { ip => "[host][ip]" }
            target => "probes"
          }
        ]
        # reload the local database every hour
        loader_schedule => "00 * * * *"
        jdbc_user => "logstash"
        jdbc_password => "12345678"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_driver_library => "/extras/mysql-connector-java-5.1.46.jar"
        jdbc_connection_string => "jdbc:mysql://"
      }

      # If found, add the name
      if [probes][0][description] {
        mutate {
          # using add_field here to add & rename values to the event root
          add_field => { nombre_sonda => "%{[probes][0][description]}" }
        }
      } else {
        # If not, write a default name
        mutate {
          add_field => { nombre_sonda => "unknown" }
        }
      }

      # probes was only needed temporarily, so remove it
      mutate {
        remove_field => ["probes"]
      }
    }

I'm receiving NetFlow packets, and I look up the host.ip field from NetFlow in a remote MySQL database to get the name for that IP. The filter works perfectly, but if I start Logstash while the MySQL server is down, or authentication fails, Logstash just shuts down with the following logs:

    [ERROR][logstash.agent           ] Failed to execute action {:id=>:flows, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<flows>, action_result: false", :backtrace=>nil}
    [INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
    [INFO ][logstash.runner          ] Logstash shut down.

Is there any Logstash parameter to prevent the shutdown when the JDBC driver fails to connect? As I said, it's not a syntax error; it fails only when the remote MySQL is unavailable.



Hi Cesar,

Looking at the documentation, it looks to me like the local_lookups section is wrong.

Would you mind changing:

    parameters => { ip => "[host][ip]" }

to:

    parameters => { "ip" => "%{[host][ip]}" }

Does that work for you? Let me know how it goes.

I forgot to mention: you might add a conditional around the JDBC filter so that, if the host.ip field doesn't exist, the lookup is skipped:

    if [host][ip] {

This means: if the field host.ip exists and is not null, do the JDBC lookup.
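As a sketch, applied to the configuration from the question (the jdbc_static body is abbreviated here), the guard would wrap the filter like this:

```
filter {
  # only attempt the lookup when the event actually carries host.ip
  if [host][ip] {
    jdbc_static {
      # ... loaders, local_db_objects, local_lookups,
      # loader_schedule and jdbc_* connection settings as before ...
    }
  }
}
```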

Thanks Andre for the suggestions, but that's not the problem I reported: when the MySQL server is not up, Logstash fails at startup and just shuts down.


Yeah, sorry, I was a bit too focused on the errors mentioned.

If you start Logstash and MySQL is not available at that time, I don't think there is much we can do: the JDBC filter relies on the Sequel library, which itself relies on the Java methods that handle JDBC connections. So it might be better to rely on high availability of the MySQL database, e.g. master/slave replication or a Galera cluster.

If the MySQL connection becomes unavailable while Logstash is already running, Logstash should keep running. Adding some parameters to the jdbc:mysql URL should help in that case: &autoReconnect=true&failOverReadOnly=false&maxReconnects=10

At least that way, instead of the nil error you mentioned, I get the original exception, which I've formatted a bit for you in the next comment.
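For example (the hostname, port, and database name below are placeholders I made up, not values from the original post), the connection string could look like:

```
jdbc_connection_string => "jdbc:mysql://db.example.com:3306/hostsdb?autoReconnect=true&failOverReadOnly=false&maxReconnects=10"
```

Note that the first URL parameter is introduced with `?`; `&` only separates the ones that follow.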


Ok, in the meantime I did some additional digging.

What you normally do when a pipeline cannot load is enable automatic reloading of the pipeline configuration:

config.reload.automatic: true

Additionally, you can rectify the situation by splitting this into a pipeline-to-pipeline configuration, though you still need reloading enabled. If you isolate the JDBC filter and the output into a separate pipeline, the Logstash input port stays online even when the MySQL port is unreachable. Once MySQL is available again, the config reload setting will make the lookup pipeline kick in.
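A sketch of such a split, with pipeline IDs, the NetFlow port, and file paths that are only placeholders of my own choosing (adjust them to your setup):

```yaml
# pipelines.yml
- pipeline.id: flows-input
  path.config: "/etc/logstash/conf.d/flows-input.conf"
- pipeline.id: flows-enrich
  path.config: "/etc/logstash/conf.d/flows-enrich.conf"
```

```
# flows-input.conf: keeps the NetFlow port open regardless of MySQL
input  { udp { port => 2055 codec => netflow } }
output { pipeline { send_to => ["flows-enrich"] } }

# flows-enrich.conf: can fail and be reloaded without closing the input port
input  { pipeline { address => "flows-enrich" } }
filter { jdbc_static { ... } }
output { elasticsearch { ... } }
```

With config.reload.automatic: true in logstash.yml, only the flows-enrich pipeline fails while MySQL is down, and it is picked up again on reload once MySQL is reachable.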

If you still have questions, please let me know.