Load more than 1 million rows from SQL Server to ES with Logstash

Hi everyone.
I have a case where my table has more than 1 million rows in SQL Server, and I use the Logstash JDBC input to load them into Elasticsearch. It works fine but runs very slowly.
This is my config file.

input {
  jdbc {
    jdbc_driver_library => "C:\ELK\elasticsearch-7.12.0-windows-x86_64\elasticsearch-7.12.0\lib\sqljdbc_9.2\enu\mssql-jdbc-9.2.1.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://BPHHP13:1433;databaseName=WRIPMS;"
    jdbc_user => "test"
    jdbc_password => "TuanTu2017@)!&"
    jdbc_paging_enabled => true
    clean_run => true
    schedule => "*/5 * * * * *"
    statement => "select [countyId], [countyName], [modifiedDate] from Counties where [modifiedDate] > :sql_last_value"
    use_column_value => true
    tracking_column => "modifiedDate"
    lowercase_column_names => false
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200/"
    index => "counties_index"
    action => "update"
    doc_as_upsert => true
    document_id => "%{countyId}"
  }
  stdout {
    codec => rubydebug
  }
}
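For context, bulk-load throughput with the JDBC input is usually governed by how many rows the driver fetches per round trip and how large each paged query is. A hedged sketch of the relevant input options follows; the values are illustrative, not recommendations, and the timestamp tracking type is an assumption based on `modifiedDate` being a date column:

```
input {
  jdbc {
    # ... connection settings as above ...

    # Rows the JDBC driver fetches per round trip to SQL Server.
    jdbc_fetch_size => 10000

    # With jdbc_paging_enabled, each paged query pulls this many rows.
    jdbc_page_size => 50000
    jdbc_paging_enabled => true

    # Assumption: modifiedDate is a datetime column, so the tracking
    # column should be compared as a timestamp, not the numeric default.
    tracking_column => "modifiedDate"
    tracking_column_type => "timestamp"
    use_column_value => true

    # Note: clean_run => true discards the stored :sql_last_value at
    # startup, so the pipeline re-reads the whole table from scratch.
    clean_run => false
  }
}
```

Dropping the `stdout { codec => rubydebug }` output for a million-row load is also worth trying, since printing every event to the console adds per-document overhead.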

Can anyone please help me?
Thank you.

Hey

You already opened the same question at Load more than 1 million row from SQL Server to ES with Logstash JDBC

Let's keep the discussion in one single place. Thanks!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.