liweigao
(BlueMonster)
December 9, 2015, 12:09pm
1
I have 5 million rows in the database. When I use the Logstash JDBC input to load them from MySQL into Elasticsearch, the service dies. What do I need to do?
Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::JavaSql::SQLException: Incorrect key file for table '/tmp/#sql_5f4_0 .MYI'; try to repair it>, :level=>:warn}
warkolm
(Mark Walkom)
December 10, 2015, 2:50am
2
Providing your config and the version you are running will be helpful.
liweigao
(BlueMonster)
December 10, 2015, 2:55am
3
input {
  jdbc {
    jdbc_driver_library => "/data/applications/tools/logstash-2.0.0/mysql-connector-java-5.1.28-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://120.3.3.20:3323/_order"
    jdbc_user => "admin"
    jdbc_password => "admin"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "5000000"
    schedule => "33 * * * *"
    statement => "SELECT * from tmp_ticket_order_1509 limit 5000000"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mysqltest"
    document_type => "pay_order_1512"
    document_id => "%{id}"
  }
  file {
    path => "/data/applications/tools/logstash-2.0.0/logs/test.log"
  }
}
This is my config. Logstash is version 2.0 and Elasticsearch is 2.0 too. Thank you warkolm!
warkolm
(Mark Walkom)
December 11, 2015, 2:09am
4
Have you tried reducing the page size? Increasing the LS heap may also help.
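Something like this, as a rough sketch only (the 50000 is just an illustration, not a value tuned for your data):

  input {
    jdbc {
      # ... same driver, connection, user and password settings as in your config ...
      jdbc_paging_enabled => "true"
      # a much smaller page, so each query only pulls this many rows at a time
      jdbc_page_size => "50000"
      # no manual LIMIT is needed once paging is enabled
      statement => "SELECT * from tmp_ticket_order_1509"
      schedule => "33 * * * *"
    }
  }

On Logstash 2.x you can raise the heap by setting the LS_HEAP_SIZE environment variable (for example LS_HEAP_SIZE=2g) before starting Logstash.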
liweigao
(BlueMonster)
December 11, 2015, 4:58am
5
I did that, but if I want to import 5 million rows, won't I have to write a lot of .conf files? Thank you warkolm.
warkolm
(Mark Walkom)
December 11, 2015, 10:04pm
6
No, because it will page through the entire table.
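With jdbc_paging_enabled, the plugin wraps your statement and fetches it in chunks of jdbc_page_size, so a single .conf file covers all 5 million rows. Roughly, assuming the jdbc_page_size => "50000" from the sketch above, it issues queries like the following (the exact SQL is generated by Sequel and may differ slightly):

  -- one page at a time, until the whole result has been read
  SELECT * FROM (SELECT * from tmp_ticket_order_1509) AS t1 LIMIT 50000 OFFSET 0
  SELECT * FROM (SELECT * from tmp_ticket_order_1509) AS t1 LIMIT 50000 OFFSET 50000
  -- ... and so on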