JDBC, Putting result of SQL query into one field


(Alexander) #1

Hello everyone!
I have a problem with configuring the logstash 5.2.
I use the jdbc input plugin, and I want to put the result of the SQL query into a single field called "query_result", but I don't know how. Can you help me?

My input:

input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/mysql-connector-java-5.1.40/mysql-connector-java-5.1.40-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://some_url/"
    jdbc_user => "some_user"
    jdbc_password => "some_password"
    schedule => "* * * * *"
    statement => "SELECT * FROM my_table WHERE wid > :sql_last_value"
    tracking_column => "wid"
    tracking_column_type => "numeric"
    use_column_value => true
    codec => "plain"
    add_field => { "[@metadata][input]" => "jdbc" }
  }
}

As a result, I get an event like this:

{
  "wid":1490509,
  "@timestamp":"2017-02-02T09:04:00.284Z",
  "@version":"1",
  "field_1":"value_1", 
  "field_2":"value_2", 
  ... ,
  "field_n":"value_n", 
  "tags":["jdbc"]
}

Where field_1, ... field_n are columns of my_table.

But I want to see the following result:

{
  "wid":1490509,
  "@timestamp":"2017-02-02T09:04:00.284Z",
  "@version":"1",
  "query_result": {
    "field_1":"value_1", 
    "field_2":"value_2", 
    ... ,
    "field_n":"value_n"
  },
  "tags":["jdbc"]
}

(Guy Boertje) #2

The jdbc input does not allow this. If you really need it, you will need to modify a fork of the plugin and run with that.

I can help with the code that needs to change; it's only a few extra characters in one line. That is the easy bit. The harder parts are knowing what to do on GitHub and getting Logstash to install your fork.

Alternatively, you can use the ruby filter to stitch the keys and values back together. Performance will go down, and the Ruby code is a bit tricky, especially because it needs to be written as a one-liner.
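A minimal sketch of that ruby filter approach, assuming the only top-level fields to keep at the root are Logstash's bookkeeping fields and the tracking column `wid` (adjust the `keep` list to your own schema):

```
filter {
  ruby {
    # Move every top-level field except the reserved ones under [query_result].
    code => "keep = ['@timestamp', '@version', 'tags', 'wid']; event.to_hash.each { |k, v| next if keep.include?(k); event.set('[query_result][' + k + ']', v); event.remove(k) }"
  }
}
```

The upside over mutate is that you don't need to know the column names in advance; the downside is the per-event Ruby overhead mentioned above.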

How good are your GitHub knowledge and Ruby coding skills?


(Alexander) #3

Thanks for the answer.
In my opinion, the first option, maintaining my own fork of the jdbc plugin, is hard: the fork would need to be upgraded every time Logstash is updated. Unfortunately, I'm not good at Ruby, so I don't think I can solve this problem myself.

Maybe there is a solution using standard Logstash features. For example, I could send the jdbc-generated event through a Logstash pipeline again; the original event would then be nested in the "message" field, and I could rename "message" to "query_result" with a mutate filter. Or maybe there's a simpler solution.


(Guy Boertje) #4

I can't see that working. The jdbc input creates an event from each row; in that event, each column name becomes a key and the column value becomes a value at the root level.

It will be hard to know what the collection of column names is outside of the jdbc input code.


(Fabien Baligand) #5

You can add a mutate filter to do that:

mutate {
  rename => { "field1" => "[query_result][field1]" }
}
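For several columns, the renames can go into a single mutate filter. This is a sketch using the placeholder column names from the question; you would list your actual column names, which means they must be known ahead of time:

```
filter {
  mutate {
    rename => {
      "field_1" => "[query_result][field_1]"
      "field_2" => "[query_result][field_2]"
      "field_n" => "[query_result][field_n]"
    }
  }
}
```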


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.