I am using Logstash to load multiple tables from MySQL into Elasticsearch. As suggested, I will create one index per table.
I have two problems:

- I have to comment out the `if [type] == "activity"` condition in the output for the data to be imported successfully; otherwise no index named "activity" is created.
- I created a template.json file, but Logstash seems to map the columns in its own way; that is, my custom mapping does not take effect.

Thanks in advance! Here is my config:
input {
  jdbc {
    jdbc_driver_library => "/root/logstash-6.3.0/mysql-connector-java-8.0.11.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/dbslave0608"
    jdbc_user => "root"
    jdbc_password => "Oh-my-sql"
    jdbc_default_timezone => "Asia/Shanghai"
    jdbc_paging_enabled => true
    last_run_metadata_path => "jdbc_last_run_activity"
    schedule => "* * * * *"
    statement => "select * from activity where update_time > :sql_last_value"
    tracking_column => "update_time"
    # update_time is a datetime column, so track it as a timestamp
    # (the default tracking_column_type is numeric)
    tracking_column_type => "timestamp"
    use_column_value => true
    type => "activity"
  }
}
output {
  #if [type] == "activity" {
    elasticsearch {
      hosts => "https://localhost:9200"
      index => "test"
      document_id => "%{id}"
      user => "logstash"
      password => "myawesomepassword"
      cacert => "/root/logstash-6.3.0/config/root-ca.pem"
      template_name => "activity"
      template => "/root/logstash-6.3.0/config/mappings/activity.json"
    }
  #}
  stdout {
    codec => rubydebug
  }
}
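For context, a minimal activity.json in the Elasticsearch 6.x legacy index-template format would look something like the sketch below. The "id" and "update_time" fields come from the query above; their mapping types and the settings block are assumptions, not my actual file. Two things I understand matter here: the template only applies to indices whose names match index_patterns (my output currently writes to "test", which would not match "activity*"), and templates are only applied when an index is created, so an existing index keeps its old mapping unless it is deleted and reindexed.

```json
{
  "index_patterns": ["activity*"],
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "doc": {
      "properties": {
        "id":          { "type": "long" },
        "update_time": { "type": "date" }
      }
    }
  }
}
```

The mapping type name "doc" matches the default document_type that the Logstash 6.x elasticsearch output uses against Elasticsearch 6.x; if a different document_type is set in the output, the key in "mappings" has to match it.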