Hi:
I have a MySQL table named "orders" with ten million records, and I am trying to import all of the data into Elasticsearch with the logstash-input-jdbc plugin, but the task takes a very long time to finish. I have tuned all the parameters I know of, such as adding "useCursorFetch=true" to the MySQL connection string and setting "jdbc_fetch_size" and "jdbc_paging_enabled", but it does not work as I expected. Could anyone help me solve this problem? Thanks in advance.
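As far as I understand, with jdbc_paging_enabled the plugin wraps my statement in LIMIT/OFFSET pages (sized by jdbc_page_size), so the queries it actually sends look roughly like this (my own sketch, not necessarily the exact SQL the plugin generates):

SELECT * FROM (select * from orders) AS t1 LIMIT 100000 OFFSET 5000000

and on MySQL the later pages seem to get slower and slower, because the server still has to walk past all of the skipped rows.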
My Logstash config file is as follows:
input {
  stdin {
  }
  jdbc {
    # mysql jdbc connection string to our backup database
    jdbc_connection_string => "jdbc:mysql://dev.mysql.xxx.so:3306/test?useCursorFetch=true"
    # the user we wish to execute our statement as
    jdbc_user => "user"
    jdbc_password => "test"
    # the path to our downloaded jdbc driver
    jdbc_driver_library => "/home/api/mysql-connector-java-5.1.25.jar"
    # the name of the driver class for mysql
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_paging_enabled => "true"
    jdbc_fetch_size => "50000"
    statement_filepath => "../jdbc.sql"
    schedule => "*/2 * * * * *"
    type => "jdbc"
  }
}
filter {
  json {
    source => "message"
    remove_field => ["message"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test"
    document_type => "orders"
    document_id => "%{id}"
  }
}
And the content of the jdbc.sql file is:
select * from orders
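For reference, one variation I have been considering (a rough sketch, assuming "orders" has an auto-increment integer primary key "id", the same column the config above uses as document_id) is to drop jdbc_paging_enabled and let the plugin track the last id it has seen via :sql_last_value, so each scheduled run only reads the next slice:

-- jdbc.sql, keyset-style paging on the primary key instead of LIMIT/OFFSET
select * from orders where id > :sql_last_value order by id asc limit 50000

with the matching options in the jdbc input:

# hypothetical settings for tracking the id column between runs
jdbc_paging_enabled => "false"
use_column_value => true
tracking_column => "id"

I am not sure this is the right direction, so any advice is welcome.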