Logstash not pushing all data into Elasticsearch

I have 700,000 records with 300 columns in my database, but when I push them into Elasticsearch through Logstash, it only indexes something like 250,000 records.

I have used the below in my Logstash config file:

useCursorFetch=true
jdbc_fetch_size => 50000

It is still not working, and it is not giving any error.

Thanks

Are you using paging?
Do you have any filters?

Posting the configuration would help see what's going on.

Might try adding the below.

jdbc_paging_enabled => true
jdbc_page_size => 50000
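For context, here is a minimal, untested sketch of where those options sit inside a jdbc input (the driver, database, and table names below just echo this thread and are placeholders). With jdbc_paging_enabled, the plugin wraps your statement in LIMIT/OFFSET queries and walks the full result set jdbc_page_size rows at a time, instead of trying to pull everything in one go:

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-8.0.16/mysql-connector-java-8.0.16.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb"   # placeholder
    jdbc_user => "root"
    jdbc_password => ""
    jdbc_paging_enabled => true   # fetch the result set in pages
    jdbc_page_size => 50000       # rows per page
    statement => "SELECT * FROM ccmaping"
  }
}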

My Logstash config file:

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-8.0.16/mysql-connector-java-8.0.16.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb?serverTimezone=Asia/Kolkata&zeroDateTimeBehavior=convertToNull&useCursorFetch=true"
    jdbc_user => "root"
    jdbc_password => ""
    tracking_column => "unix_ts_in_secs"
    use_column_value => true
    jdbc_fetch_size => 50000
    jdbc_default_timezone => "Asia/Kolkata"
    statement => "SELECT
                    applicationNo,
                    endDate,
                    ccmaping.*,
                    UNIX_TIMESTAMP(modifyOn) AS unix_ts_in_secs
                  FROM
                    ccmaping
                  WHERE
                    (UNIX_TIMESTAMP(modifyOn) > :sql_last_value AND modifyOn < NOW())"
    schedule => "*/5 * * * * *"
  }
}
filter {
  date {
    # Joda-time pattern: lowercase yyyy/dd and uppercase HH for a 24-hour clock
    match => [ "endDate", "yyyy-MM-dd HH:mm:ss" ]
    timezone => "Asia/Kolkata"
    target => "endDate"
  }
}
output {
  elasticsearch {
    document_type => "_doc"
    document_id => "%{id}"
    index => "testindex"
    hosts => ["http://localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
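For what it's worth, here is a sketch of how the suggested paging options could be folded into this input. Everything except jdbc_paging_enabled, jdbc_page_size, and the ORDER BY is taken from the config above; the ORDER BY is an assumption added here because LIMIT/OFFSET paging over an unordered result set can skip or duplicate rows between pages. This is illustrative, not a tested config:

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-8.0.16/mysql-connector-java-8.0.16.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb?serverTimezone=Asia/Kolkata&zeroDateTimeBehavior=convertToNull&useCursorFetch=true"
    jdbc_user => "root"
    jdbc_password => ""
    tracking_column => "unix_ts_in_secs"
    use_column_value => true
    jdbc_paging_enabled => true   # wraps the statement in LIMIT/OFFSET pages
    jdbc_page_size => 50000       # rows per page
    jdbc_default_timezone => "Asia/Kolkata"
    statement => "SELECT applicationNo, endDate, ccmaping.*, UNIX_TIMESTAMP(modifyOn) AS unix_ts_in_secs
                  FROM ccmaping
                  WHERE UNIX_TIMESTAMP(modifyOn) > :sql_last_value AND modifyOn < NOW()
                  ORDER BY unix_ts_in_secs"
    schedule => "*/5 * * * * *"
  }
}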
