How to load a big table into ES with Logstash

(Gaobo Yu) #1

Hi:
I have a MySQL table named "orders" with ten million records, and I am trying to import all the data into ES with the plugin logstash-input-jdbc, but it takes a very long time to finish. I have tuned all the parameters I know of, such as the MySQL "useCursorFetch=true", "jdbc_fetch_size", and "jdbc_paging_enabled", but it does not work as I expected. Could anyone help me solve this problem? Thanks in advance.

my logstash config file is as below:

input {
  jdbc {
    # mysql jdbc connection string to our backup database
    jdbc_connection_string => "jdbc:mysql://"
    # the user we wish to execute our statement as
    jdbc_user => "user"
    jdbc_password => "test"
    # the path to our downloaded jdbc driver
    jdbc_driver_library => "/home/api/mysql-connector-java-5.1.25.jar"
    # the name of the driver class for mysql
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_paging_enabled => "true"
    jdbc_fetch_size => "50000"
    statement_filepath => "../jdbc.sql"
    schedule => "*/2 * * * * *"
    type => "jdbc"
  }
}

filter {
  json {
    source => "message"
    remove_field => ["message"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test"
    document_type => "orders"
    document_id => "%{id}"
  }
}
And the content of the jdbc.sql file is:
select * from orders
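One thing worth knowing: with jdbc_paging_enabled, Logstash pages through the table with LIMIT/OFFSET queries, and large OFFSET values get progressively slower on MySQL. A sketch of an alternative, assuming the orders table has an auto-increment numeric id primary key, is to page on the key yourself with the plugin's tracking column and :sql_last_value (column names here are assumptions, not from your schema):

```
input {
  jdbc {
    # ... same connection, driver, and credential settings as above ...
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    # page on the primary key instead of OFFSET; each scheduled run
    # picks up where the last one left off via :sql_last_value
    statement => "SELECT * FROM orders WHERE id > :sql_last_value ORDER BY id ASC LIMIT 50000"
    schedule => "*/2 * * * * *"
  }
}
```

Each run then does an indexed range scan rather than an increasingly expensive OFFSET, which is usually much faster for full-table imports of this size.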

(Mark Walkom) #2

How long?

(Gaobo Yu) #3

I have imported 3,100,000 records into ES, and it took about 50 minutes; the store.size on ES is 2.1GB.
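For reference, those numbers work out to roughly a thousand documents per second:

```python
# Throughput implied by the reported numbers:
# 3,100,000 records in about 50 minutes.
records = 3_100_000
minutes = 50

docs_per_second = records / (minutes * 60)
print(round(docs_per_second))  # ~1033 docs/sec
```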

(Mark Walkom) #4

What sort of monitoring do you have in place to tell if it's the DB or Elasticsearch/Logstash?

(Gaobo Yu) #5

I use the ELK tools with logstash-input-jdbc; there are no other monitoring tools.

(Christian Dahlqvist) #6

I would recommend running the configuration once with the elasticsearch output replaced by e.g. a file output. That way you can measure the throughput of reading from the database on its own. Also monitor how much CPU you are using while doing this.
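The suggestion above can be sketched by swapping only the output block; the file path here is just an example:

```
output {
  # temporarily write events to disk instead of Elasticsearch,
  # so you measure only the database/Logstash read throughput
  file {
    path => "/tmp/orders-throughput-test.log"
  }
}
```

If this run is much faster than the full pipeline, the bottleneck is on the Elasticsearch side; if it is just as slow, the bottleneck is the database read.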

(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.