Logstash JDBC for MySQL


(BlueMonster) #1

I have 5 million rows in the database. When I use the Logstash jdbc input to load them from MySQL into ES, the service dies. What do I need to do?

Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::JavaSql::SQLException: Incorrect key file for table '/tmp/#sql_5f4_0.MYI'; try to repair it>, :level=>:warn}


(Mark Walkom) #2

Providing your config and the version you are running will be helpful.


(BlueMonster) #3

input {
  jdbc {
    jdbc_driver_library => "/data/applications/tools/logstash-2.0.0/mysql-connector-java-5.1.28-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://120.3.3.20:3323/_order"
    jdbc_user => "admin"
    jdbc_password => "admin"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "5000000"
    schedule => "33 * * * *"
    statement => "SELECT * from tmp_ticket_order_1509 limit 5000000"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mysqltest"
    document_type => "pay_order_1512"
    document_id => "%{id}"
  }
  file {
    path => "/data/applications/tools/logstash-2.0.0/logs/test.log"
  }
}

This is my config. The Logstash version is 2.0, and Elasticsearch is 2.0 too. Thank you, warkolm!


(Mark Walkom) #4

Have you tried reducing the page size? Increasing the Logstash heap may also help.
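For illustration, a much smaller `jdbc_page_size` in the jdbc input would look like the sketch below (the value 5000 is only an example; tune it for your hardware, and keep the rest of the input block as in the config above):

```
jdbc {
  # ... driver, connection, user, password as before ...
  jdbc_paging_enabled => true
  jdbc_page_size => 5000   # fetch 5,000 rows per query instead of 5,000,000
  statement => "SELECT * FROM tmp_ticket_order_1509"
}
```

With paging enabled, the `limit 5000000` in the original statement is unnecessary, since Logstash controls the batch size itself.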


(BlueMonster) #5

I did that, but if I want to import all 5 million rows, I'll have to write a lot of .conf files, right? Thank you, warkolm.


(Mark Walkom) #6

No, because it will page through the entire table.
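Roughly speaking, with `jdbc_paging_enabled` Logstash (via the Sequel library) wraps your statement and issues successive LIMIT/OFFSET queries until the result set is exhausted, so a single .conf file covers the whole table. The shape of the generated SQL is approximately (illustrative, not the exact text Sequel emits):

```
SELECT * FROM (SELECT * FROM tmp_ticket_order_1509) AS t1 LIMIT 5000 OFFSET 0;
SELECT * FROM (SELECT * FROM tmp_ticket_order_1509) AS t1 LIMIT 5000 OFFSET 5000;
-- ... and so on, until a page returns fewer than 5000 rows
```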


(BlueMonster) #7

thank you


(system) #8