Hello,
I started playing with the Elastic Stack two days ago and have fallen completely in love with it!
However, since last night my import has stopped working.
I've been creating and deleting indexes and changing mappings in my config file over and over, but I can't get it to import more than the 588460 rows it has done so far.
There is no error; it runs through everything fine, just like it has for the past two days... but it doesn't add any more rows to my index.
There are a little over 4 million rows in my MySQL table, and they were all imported successfully before (yesterday). However, I had some issues with my data and the field mappings, so I was tinkering around. Whether I delete the index, recreate it, and push the records again... it only ever imports 588460 rows, no matter what I do.
I thought it might be related to the record_last_run data or something, but that doesn't seem to make a difference.
Here is my config file. Any thoughts?
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:8889/test"
    jdbc_user => "root"
    jdbc_password => "root"
    jdbc_driver_library => "/Users/canphaz/mysql-connector-java-5.1.41/mysql-connector-java-5.1.41-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    clean_run => false
    record_last_run => true
    use_column_value => true
    jdbc_fetch_size => 100000
    jdbc_paging_enabled => true
    jdbc_page_size => 100000
    tracking_column => "id"
    last_run_metadata_path => "/Users/canphaz/.logstash_jdbc_last_run_testtable"
    statement => "SELECT * FROM testtable"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => "localhost:9200"
    index => "testtable"
    user => "elastic"
    password => "changeme"
    document_type => "data"
    document_id => "%{id}"
  }
}
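One thing I'm unsure about regarding the record_last_run / tracking_column part: my (possibly wrong) understanding from the docs is that the value saved in the metadata file only limits which rows get fetched if the statement itself references :sql_last_value. This is just a sketch of how I read it, not what I'm currently running:

  input {
    jdbc {
      # ... same connection and driver settings as above ...
      use_column_value => true
      tracking_column => "id"
      record_last_run => true
      last_run_metadata_path => "/Users/canphaz/.logstash_jdbc_last_run_testtable"
      # only fetch rows newer than the last id stored in the metadata file
      statement => "SELECT * FROM testtable WHERE id > :sql_last_value ORDER BY id"
    }
  }

If that's relevant, great; otherwise I have no idea why it stops at exactly 588460 rows every time.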