Database in Elasticsearch

@Diego_Gomez

Do you want to copy the whole table, or just the new values since the last record that was already synced?

Regards

Copy the updated data every minute. The sensors emit a reading every 10 seconds; I would like to sync every minute and copy that data.

Hi, I still can't figure out how to set the parameters so that Logstash picks up the data from my database. If someone can show me how to do it, I'd really appreciate it. Regards

Hi Diego, you don't need to use any special parameter: just set a schedule (every 10 seconds, say), and then in the statement (SQL query) use sql_last_value to fetch all rows whose date_time, or whatever field stores the date, is greater than sql_last_value. Try it and you'll see.

Regards!

Hi Gabriel, forgive my ignorance, but I can't work out how to do it. Could you give me an example of what I would have to write? Sorry again, this is all new to me.

schedule => "* * * * *"
statement => "SELECT * from songs where artist = :favorite_artist"
    }

Something like this:

schedule => "*/5 * * * *"
statement => "SELECT * from <your_table> where <your_date_field> >= :sql_last_value"
    }

*/5 * * * * means every 5 minutes.
statement holds the query you need; the one above is just an example.
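
For completeness, a minimal self-contained jdbc input sketch along those lines (the paths, credentials, table and column names here are placeholders, not your actual setup):

input {
  jdbc {
    # Placeholder driver and connection details; adjust to your environment.
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/<your_database>"
    jdbc_user => "<user>"
    jdbc_password => "<password>"
    # Run the query every 5 minutes.
    schedule => "*/5 * * * *"
    # Fetch only the rows newer than the value remembered from the previous run.
    statement => "SELECT * FROM <your_table> WHERE <your_date_field> >= :sql_last_value"
  }
}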

Regards!

Thanks, I'll give it a try.

Hi, I'm getting this error

[2016-11-29T13:50:57,538][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2016-11-29T13:55:00,241][INFO ][logstash.inputs.jdbc ] (0.036000s) SELECT version() AS v LIMIT 1
[2016-11-29T13:55:00,333][ERROR][logstash.inputs.jdbc ] Java::ComMysqlJdbcExceptionsJdbc4::MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ' where <2016-11-29 13:47:11> >= '1970-01-01 00:00:00'' at line 1: SELECT * from where <2016-11-29 13:47:11> >= '1970-01-01 00:00:00'
[2016-11-29T13:55:00,346][WARN ][logstash.inputs.jdbc ] Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::ComMysqlJdbcExceptionsJdbc4::MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ' where <2016-11-29 13:47:11> >= '1970-01-01 00:00:00'' at line 1>}

You have to parse your date somehow so that the comparison is made between actual dates, as handled here: http://stackoverflow.com/a/33803288.

Other examples: http://stackoverflow.com/questions/29245754/mysql-where-clause-with-date-format
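
If the date is stored as text rather than as a real DATETIME, one option (sketched here with a made-up column name date_str) is to convert it inside the query so the comparison happens between actual dates:

# date_str is an assumed text column holding dates as 'YYYY-MM-DD HH:MM:SS'.
statement => "SELECT * FROM <your_table> WHERE STR_TO_DATE(date_str, '%Y-%m-%d %H:%i:%s') >= :sql_last_value"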

Hi, this is my configuration

input {
  jdbc {
    jdbc_driver_library => "/home/rack/mysql-connector-java-5.1.40/mysql-connector-java-5.1.40-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://10.10.0.132:3306/testdb"
    jdbc_user => "user"
    jdbc_password => "rack"
    schedule => "*/1 * * * *"
    statement => "select id, DATE_FORMAT(timestamp, '%Y-%m-%d %T') AS id, id_wasp, id_secret, frame_type, frame_number, sensor, value, timestamp, raw, parser_type, trans_id from testdb where timestamp >= :sql_last_start"
  }
}

filter {
  date {
    locale => "en"
    timezone => "ART"
    match => [ "timestamp", "YYYY-MM-dd HH:mm:ss" ]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["10.10.0.132:9200"]
    index => "audit-%{+YYYY.MM.dd}"
    document_id => "%{id}"
  }
  stdout {
    codec => rubydebug
  }
}

This is what it throws:

[2016-11-29T15:29:30,222][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["10.10.0.132:9200"]}
[2016-11-29T15:29:30,256][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["/home/rack/logstash-5.0.0/vendor/bundle/jruby/1.9/gems/logstash-filter-date-3.0.3/lib/logstash/filters/date.rb:297:in `setupMatcher'", "org/jruby/RubyArray.java:1613:in `each'", "/home/rack/logstash-5.0.0/vendor/bundle/jruby/1.9/gems/logstash-filter-date-3.0.3/lib/logstash/filters/date.rb:224:in `setupMatcher'", "/home/rack/logstash-5.0.0/vendor/bundle/jruby/1.9/gems/logstash-filter-date-3.0.3/lib/logstash/filters/date.rb:188:in `register'", "/home/rack/logstash-5.0.0/logstash-core/lib/logstash/pipeline.rb:197:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/home/rack/logstash-5.0.0/logstash-core/lib/logstash/pipeline.rb:197:in `start_workers'", "/home/rack/logstash-5.0.0/logstash-core/lib/logstash/pipeline.rb:153:in `run'", "/home/rack/logstash-5.0.0/logstash-core/lib/logstash/agent.rb:250:in `start_pipeline'"]}
[2016-11-29T15:29:30,348][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2016-11-29T15:29:33,278][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}

Hi Diego!

At this point it's a problem with the date filter. It looks like it doesn't like the locale or the timezone.

I think the timezone is wrong. There is no ART timezone; it should be the following:

  date {
    locale => "en"
    timezone => "America/Buenos_Aires"
    match => [ "timestamp", "YYYY-MM-dd HH:mm:ss" ]
    target => "@timestamp"
  }

Regards!

--Gabriel

Gabriel, now it gives me this:

Check the manual that corresponds to your MySQL server version for the right syntax to use near ': sql_last_start' at line 1>}
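
As far as I can tell, the jdbc input bundled with Logstash 5.x only substitutes :sql_last_value; :sql_last_start is no longer recognized, so it reaches MySQL literally and triggers that syntax error. A sketch of the corrected line (table and column names are placeholders):

# Only :sql_last_value is replaced by the plugin before the query is sent to MySQL.
statement => "SELECT * FROM <your_table> WHERE <your_date_field> >= :sql_last_value"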

Hi Gabriel, I finally got Logstash to give me data, but it only did so the first time; after that it throws this:

"@timestamp" => 2016-11-30T14:31:25.789Z,
   " @version" => "1",
  "sensor" => "TCA",
      "id" => "2016-11-29 17:33:23",
   "value" => "22.26",
"timestamp" => 2016-11-29T20:33:23.000Z
 }
{
 "id_secret" => "408425467",
     "raw" => "noraw",
"parser_type" => 1,
"frame_type" => 128,
"frame_number" => 204,
 "id_wasp" => "Waspmote_PRO",
    "tags" => [
[0] "_dateparsefailure"
],
 "@timestamp" => 2016-11-30T14:31:25.789Z,
"@version" => "1",
  "sensor" => "TCA",
      "id" => "2016-11-29 17:33:29",
   "value" => "22.26",
 "timestamp" => 2016-11-29T20:33:29.000Z
}
[2016-11-30T11:32:00,309][INFO ][logstash.inputs.jdbc     ] select id, DATE_FORMAT(timestamp, '%Y-%m-%d %T') AS id, id_wasp, id_secret, frame_type, frame_number, sensor, value, timestamp, raw, parser_type, timestamp from `testdb`.`sensorParser`where timestamp >= '2016-11-30 14:31:26'
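
A guess about the _dateparsefailure tag, based only on the output above: DATE_FORMAT(...) is aliased AS id, so it overwrites the real id (note "id" => "2016-11-29 17:33:23"), while the raw timestamp column already arrives as a timestamp object rather than a string, so the "YYYY-MM-dd HH:mm:ss" pattern has nothing to parse. One possible rearrangement, giving the formatted date its own alias (ts_string is a made-up name) and pointing the date filter at it:

statement => "SELECT id, id_wasp, id_secret, frame_type, frame_number, sensor, value, raw, parser_type, timestamp, DATE_FORMAT(timestamp, '%Y-%m-%d %T') AS ts_string FROM testdb.sensorParser WHERE timestamp >= :sql_last_value"

filter {
  date {
    timezone => "America/Buenos_Aires"
    # ts_string is the plain-string alias defined above, so this pattern can match it.
    match => [ "ts_string", "YYYY-MM-dd HH:mm:ss" ]
    target => "@timestamp"
  }
}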

Now it only shows me this:

[2016-11-30T12:23:00,098][INFO ][logstash.inputs.jdbc     ] (0.084000s) select id, DATE_FORMAT(timestamp, '%Y-%m-%d %T') AS id, id_wasp, id_secret, frame_type, frame_number, sensor, value, timestamp, raw, parser_type, timestamp from `testdb`.`sensorParser`where timestamp  >= '2016-11-30 15:22:00'
[2016-11-30T12:24:00,144][INFO ][logstash.inputs.jdbc     ] (0.072000s) select id, DATE_FORMAT(timestamp, '%Y-%m-%d %T') AS id, id_wasp, id_secret, frame_type, frame_number, sensor, value, timestamp, raw, parser_type, timestamp from `testdb`.`sensorParser`where timestamp >= '2016-11-30 15:23:00'


Hi @Diego_Gomez

Any news on this? Regards!