Plugin JDBC


(Diego Gomez) #1

Hello, a quick question, because I can't get the JDBC input to work. In the config below, I don't know what to put where the "?" placeholders are so that Logstash picks up the data that is written to the database every 10 seconds. Could someone help me with this? Thank you.

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "mysql"
    parameters => { "???????????????" }
    schedule => "???????????"
    statement => "??????????????"
  }
}

(Magnus Bäck) #2
  • parameters is optional.
  • I'm not sure the schedule syntax supports finer resolution than one minute.
  • statement should contain the query that fetches the data. Without further details it's obviously hard to help. Since you probably don't want to fetch all the data every time you should look into the sql_last_value parameter that's described in a couple of places in the documentation.
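Something along these lines might be a starting point (a minimal sketch: the table name `mytable` and column `updated_at` are placeholders, not taken from your setup; `schedule` and `:sql_last_value` are documented options of the jdbc input):

 input {
   jdbc {
     jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
     jdbc_driver_class => "com.mysql.jdbc.Driver"
     jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
     jdbc_user => "mysql"
     # Cron-like syntax; "* * * * *" runs the query once per minute,
     # which is the finest standard cron resolution.
     schedule => "* * * * *"
     # :sql_last_value is maintained by the plugin between runs, so each
     # run only fetches rows newer than the previous one.
     statement => "SELECT * FROM mytable WHERE updated_at > :sql_last_value"
   }
 }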

(Diego Gomez) #3

Hi, this is the Logstash config, and below it is the error it gives me:

input {
  jdbc {
    jdbc_driver_library => "/home/rack/mysql-connector-java-5.1.40/mysql-connector-java-5.1.40-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://10.10.0.132:3306/testdb"
    jdbc_user => "user"
    jdbc_password => "rack"
    schedule => "*/1 * * * *"
    statement => "select id, DATE_FORMAT(timestamp, '%Y-%m-%d %T') AS id, id_wasp, id_secret, frame_type, frame_number, sensor, value, timestamp, raw, parser_type, trans_id from `testdb`.`sensorParser` where timestamp >= :sql_last_value"
  }
}

filter {
  date {
    locale => "en"
    timezone => "America/Buenos_Aires"
    match => [ "timestamp", "YYYY-MM-dd HH:mm:ss" ]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["10.10.0.132:9200"]
    index => "audit-%{+YYYY.MM.dd}"
    document_id => "%{id}"
  }
  stdout {
    codec => rubydebug
  }
}

[2016-11-30T11:20:00,235][WARN ][logstash.inputs.jdbc ] Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::ComMysqlJdbcExceptionsJdbc4::MySQLSyntaxErrorException: Unknown column 'trans_id' in 'field list'>}
[2016-11-30T11:21:00,293][ERROR][logstash.inputs.jdbc ] Java::ComMysqlJdbcExceptionsJdbc4::MySQLSyntaxErrorException: Unknown column 'trans_id' in 'field list': select id, DATE_FORMAT(timestamp, '%Y-%m-%d %T') AS id, id_wasp, id_secret, frame_type, frame_number, sensor, value, timestamp, raw, parser_type, trans_id from `testdb`.`sensorParser` where timestamp >= '1970-01-01 00:00:00'
[2016-11-30T11:21:00,297][WARN ][logstash.inputs.jdbc ] Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::ComMysqlJdbcExceptionsJdbc4::MySQLSyntaxErrorException: Unknown column 'trans_id' in 'field list'>}


(Magnus Bäck) #4

Unknown column 'trans_id' in 'field list'

Is this not clear?
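
The statement selects a column, trans_id, that doesn't exist in sensorParser. Dropping it (or correcting the name) should let the query run; something like this, assuming the rest of the column list is right (note the original also aliases DATE_FORMAT(...) AS id, which collides with the id column already in the list, so a distinct placeholder name such as formatted_ts is used here):

 statement => "SELECT id, DATE_FORMAT(timestamp, '%Y-%m-%d %T') AS formatted_ts, id_wasp, id_secret, frame_type, frame_number, sensor, value, timestamp, raw, parser_type FROM `testdb`.`sensorParser` WHERE timestamp >= :sql_last_value"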


(Diego Gomez) #5

Hello, I already changed it and it works now, but it only gives me data the first time; after that it doesn't output anything else. This is all I get:

"@timestamp" => 2016-11-30T14:31:25.789Z,
    "@version" => "1",
      "sensor" => "TCA",
          "id" => "2016-11-29 17:33:23",
       "value" => "22.26",
   "timestamp" => 2016-11-29T20:33:23.000Z
}
{
   "id_secret" => "408425467",
         "raw" => "noraw",
 "parser_type" => 1,
  "frame_type" => 128,
"frame_number" => 204,
     "id_wasp" => "Waspmote_PRO",
        "tags" => [
    [0] "_dateparsefailure"
 ],
  "@timestamp" => 2016-11-30T14:31:25.789Z,
    "@version" => "1",
      "sensor" => "TCA",
          "id" => "2016-11-29 17:33:29",
       "value" => "22.26",
   "timestamp" => 2016-11-29T20:33:29.000Z
}

[2016-11-30T11:32:00,309][INFO ][logstash.inputs.jdbc ] (0.067000s) select id, DATE_FORMAT(timestamp, '%Y-%m-%d %T') AS id, id_wasp, id_secret, frame_type, frame_number, sensor, value, timestamp, raw, parser_type, timestamp from `testdb`.`sensorParser` where timestamp >= '2016-11-30 14:31:26'


(Diego Gomez) #6

Please, if anyone knows what could be causing the problem in the output I posted above, I'd appreciate the help.


(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.