Terribly long time to load data from MySQL

(Gosforth) #1

It takes two seconds to display data from a view in MySQL. But it has been an hour (I am still waiting, so I cannot tell whether it will succeed) since Logstash started loading data from this view (running the command logstash -f config):

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.47-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/data"
    jdbc_user => "someuser"
    jdbc_password => "somepass"
    statement => "SELECT * FROM some_view"
  }
}

filter {
  date {
    match => ["Order_Date", "yyyy-MM-dd"]
    target => "Order_Date"
    locale => "en"
  }
  mutate {
    rename => {
      "gps_long" => "[location][lon]"
      "gps_lat" => "[location][lat]"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "some_index"
  }
  stdout {}
}


Any trick to make it work in the expected manner?

(Magnus Bäck) #2

How many rows need to be processed?

(Gosforth) #3

Only 582,000 rows. Not many. The config works with a smaller number of rows - tested.

(Magnus Bäck) #4

Okay, so that's pretty slow. Perhaps ES is underprovisioned? Or what's the bottleneck? ES is fairly memory hungry and performance will be very poor if I/O is slow.

(Gosforth) #5

This is a Logstash problem (or JDBC). I see that Logstash has not even created the index during this time. The test PC I'm running it on has 16 GB RAM and an i7 processor. Enough. It looks like jdbc_fetch_size and jdbc_page_size do not work. Maybe this JDBC driver version is outdated? I tried the latest one but it does not work at all.
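One detail worth noting about the jdbc input plugin: jdbc_page_size has no effect unless jdbc_paging_enabled is set to true. A minimal sketch of the input block with paging turned on (the page and fetch sizes below are illustrative values, not recommendations):

```
input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.47-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/data"
    jdbc_user => "someuser"
    jdbc_password => "somepass"
    statement => "SELECT * FROM some_view"
    jdbc_paging_enabled => true   # required for jdbc_page_size to apply
    jdbc_page_size => 50000       # rows per paged query
    jdbc_fetch_size => 1000       # JDBC driver fetch size hint
  }
}
```

With paging enabled, the plugin splits the statement into multiple queries using LIMIT/OFFSET, which can itself be slow on large views since MySQL re-evaluates the view for each page.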

(Magnus Bäck) #6

I see that Logstash has not even created the index during this time.

There's no index being created in Elasticsearch? That would indicate a problem with the connection to ES. Have you looked in the Logstash log for clues?

(Gosforth) #8

It creates the index, but only after a long time. What is Logstash doing? How does it work? Does it load all records into memory and apply some formatting to the data before transferring it to Elasticsearch? It stops from time to time and the system is busy working on something...

(Magnus Bäck) #9

Apart from the log (you should consider increasing its verbosity), Logstash's monitoring API could offer clues about what's going on.
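For example, assuming the default API binding of localhost:9600, you can pull per-plugin event counters and a hot-threads dump (these are standard monitoring API endpoints):

```shell
# Per-pipeline and per-plugin event/duration stats
curl -s 'http://localhost:9600/_node/stats/pipelines?pretty'

# What the worker threads are currently busy doing
curl -s 'http://localhost:9600/_node/hot_threads?pretty'
```

The duration_in_millis figures per plugin show where the pipeline is spending its time, which should reveal whether the jdbc input or the elasticsearch output is the bottleneck.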

(Gosforth) #10

How do I do that? Some switch on the 'logstash' command? Or, better, in the config file?

(Magnus Bäck) #11

The log level can be configured either via logstash.yml or a command line option. See the documentation for details.
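Concretely, both of these are standard Logstash logging settings (assuming a default install layout):

```shell
# Either pass the log level on the command line...
bin/logstash -f config --log.level=debug

# ...or persist it in config/logstash.yml:
#   log.level: debug
```

Debug level is very verbose; drop back to info once you have what you need.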

(system) #12

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.