Sending Data from MySQL to ES

Hi,
I am sending 447 documents from Logstash but ES is receiving only 43.
My config file:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://ipaddress:3306/zendb"
    jdbc_user => "****"
    jdbc_password => "*****"
    jdbc_driver_library => "/root/mysql-connector-java-8.0.20/mysql-connector-java-8.0.20/mysql-connector-java-8.0.20.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM zen_orders WHERE current_state = 3"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "zen"
    document_id => "%{id_order}"
  }
  stdout { codec => json_lines }
}

You are setting "document_id" => "%{id_order}" in your output plugin. If there are duplicate values for that field in your database, then Elasticsearch will update those documents at indexing time instead of creating duplicates. Try running a COUNT DISTINCT on that field in your database and see what number it returns. If you want all 447 documents indexed, remove the document_id setting and let Elasticsearch generate the IDs dynamically for you.
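For example, assuming the table and column names from the config above, the check would look something like:

SELECT COUNT(DISTINCT id_order)
FROM zen_orders
WHERE current_state = 3;

If this returns less than 447, some rows share an id_order and are being overwritten in the index.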

COUNT DISTINCT gives me 447, so it's not a duplication problem.

If you set the output to a file, how many documents are created?
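A minimal sketch of such a test output, assuming a writable path like /tmp (the path here is only an illustration):

output {
  file {
    path => "/tmp/zen_orders_test.json"
    codec => json_lines
  }
}

Counting the lines in that file tells you how many events actually left the jdbc input, independent of Elasticsearch.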

Even when the output is a file: 43 documents.

Then you’re not sending 447 documents to ES, you’re only sending 43 :slight_smile:

You’re not seeing any JDBC error in the log? If your jdbc input produces a different result than executing the SQL statement against the database directly, I would think it’s a problem in either the jdbc input plugin or the JDBC library.

Debug mode gives me this WARN:
[2020-05-15T08:52:44,754][WARN ][logstash.inputs.jdbc ][main] Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::JavaSql::SQLException: HOUR_OF_DAY: 2 -> 3>}
[2020-05-15T08:52:45,028][DEBUG][logstash.javapipeline ][main] Input plugins stopped! Will shutdown filter/output workers. {:pipeline_id=>"main", :thread=>"#<Thread:0x147a76c5 run>"}
[2020-05-15T08:52:45,043][DEBUG][logstash.javapipeline ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x2f9127bd run>"}
[2020-05-15T08:52:45,120][DEBUG][logstash.javapipeline ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x430cc6c1 run>"}

You had a SQL error; that’s why fewer records were returned than expected. You need to fix the SQL statement. If the statement works fine when you connect directly to the database, you probably hit a JDBC driver bug.
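For what it’s worth, the "HOUR_OF_DAY: 2 -> 3" exception is a known symptom of MySQL Connector/J choking while converting a DATETIME value that falls into a daylight-saving gap (an hour that doesn’t exist in the JVM’s local timezone). A common workaround, which is an assumption here and not something confirmed in this thread, is to pin the timezone in the JDBC URL so the conversion is done in UTC:

# assumption: same connection as above, with the serverTimezone parameter added
jdbc_connection_string => "jdbc:mysql://ipaddress:3306/zendb?serverTimezone=UTC"

The jdbc input also has a jdbc_default_timezone option that may help if your timestamps are stored in a non-UTC local time.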

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.