Logstash: how to handle null values with jdbc_streaming

Hi All,

I use the jdbc_streaming filter because I have to extract data from two distinct databases based on one column, here o_id (similar to a left outer join in SQL).


input {
  jdbc {
    statement => "select col1, o_id from table1"
  }
  ...
}
filter {
  # enrich each event with col2/col3 looked up in the second database
  jdbc_streaming {
    statement => "select col2, col3 from table2 where col4 = :p_id"
    parameters => { "p_id" => "o_id" }
    target => "toto"
    # used when the lookup returns no row
    default_hash => {
      "col2" => ""
      "col3" => ""
    }
  }

  # flatten the lookup result and drop the temporary array
  mutate {
    add_field => {
      "col2" => "%{[toto][0][col2]}"
      "col3" => "%{[toto][0][col3]}"
    }
    remove_field => ["toto"]
  }
}

output {
  elasticsearch {
    ...
    index => "my_index"
    document_id => "%{col1}"
    ...
  }
}

This means that sometimes both col2 and col3 can be empty.
Currently, when col2 is null or empty, I get the literal string %{[toto][0][col2]} in my document instead of null.
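
To make the intent clearer, here is the kind of guard I have in mind (an untested sketch, reusing the field names from my config above): copy col2/col3 only when the lookup actually returned a non-empty value, so the sprintf reference never ends up as a literal string.

filter {
  # untested sketch: only copy the columns when the lookup returned a real value
  if [toto][0][col2] and [toto][0][col2] != "" {
    mutate { add_field => { "col2" => "%{[toto][0][col2]}" } }
  }
  if [toto][0][col3] and [toto][0][col3] != "" {
    mutate { add_field => { "col3" => "%{[toto][0][col3]}" } }
  }
  mutate { remove_field => ["toto"] }
}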

I followed this topic: How to handle empty result in jdbc_streaming filter plugin in Logstash - #4 by naveen007, but something still escapes me.

I am working with the ELK stack 7.0.1.
If someone has an idea, I would appreciate the help.

Regards,
Julien
