Load data from SQL Server to Elasticsearch with document_id locally

Hi everyone.
My local table has 3000 rows. When I configure the Logstash output with
document_id => "%{countyId}", only the latest row ends up in Elasticsearch.

But when I don't use document_id => "%{countyId}", it works fine.

Can anyone please help me?
Thank you.

That means all your SQL rows end up with the same countyId.
Use another ID that is unique per row.
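For example, if no single column is unique, the fingerprint filter can build a stable document ID from several columns combined. This is only a sketch; colA and colB are placeholder column names and my_index is a placeholder index name:

filter {
	fingerprint {
		# placeholder columns: use whichever set of columns is unique per row
		source => ["colA", "colB"]
		concatenate_sources => true
		method => "SHA1"
		target => "[@metadata][generated_id]"
	}
}
output {
	elasticsearch {
		hosts => "http://localhost:9200"
		index => "my_index"
		document_id => "%{[@metadata][generated_id]}"
	}
}

Because the fingerprint is kept under @metadata, it is used as the _id but is not indexed as part of the document itself.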

Hi @ylasri.
The countyId column is the primary key in my table. This is my Logstash config:
input {
	jdbc {
		jdbc_driver_library => "C:\ELK\elasticsearch-7.12.0-windows-x86_64\elasticsearch-7.12.0\lib\sqljdbc_9.2\enu\mssql-jdbc-9.2.1.jre8.jar"
		jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
		jdbc_connection_string => "jdbc:sqlserver://BPHHP13:1433;databaseName=WRIPMS;"
		jdbc_user => "test"
		jdbc_password => "TuanTu2017@)!&"
		jdbc_paging_enabled => true
		clean_run => true
		schedule => "*/5 * * * * *"
		statement => "select [countyId], [countyName], [modifiedDate] from Counties where [countyId] > :sql_last_value"
		use_column_value => true
		tracking_column => "countyId"
	}
}
output {
	elasticsearch {
		hosts => "http://localhost:9200/"
		index => "counties_index"
		document_id => "%{countyId}"
		doc_as_upsert => true
	}
	stdout {
		codec => rubydebug
	}
}

Thank you

Try adding the parameter lowercase_column_names => false, as follows. By default the jdbc input lowercases column names, so the event carries countyid rather than countyId; the reference "%{countyId}" is then never substituted, every document gets the same literal _id, and each row overwrites the previous one:

input {
	jdbc {
		jdbc_driver_library => "C:\ELK\elasticsearch-7.12.0-windows-x86_64\elasticsearch-7.12.0\lib\sqljdbc_9.2\enu\mssql-jdbc-9.2.1.jre8.jar"
		jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
		jdbc_connection_string =>"jdbc:sqlserver://BPHHP13:1433;databaseName=WRIPMS;"
		jdbc_user => "test"
		jdbc_password => "TuanTu2017@)!&"
		jdbc_paging_enabled => true
		clean_run => true
		schedule => "*/5 * * * * *"
		statement => "select [countyId], [countyName], [modifiedDate] from Counties where [countyId] > :sql_last_value"
		use_column_value => true
		tracking_column => "countyId"
		lowercase_column_names => false
	}
}
output {
	elasticsearch {
		hosts => "http://localhost:9200/"
		index => "counties_index"
		document_id => "%{countyId}"
		doc_as_upsert => true
	}
	stdout {
		codec => rubydebug
	}
}

Hi @ylasri.
It works fine now.
Could you show me how I can check a condition before inserting or updating into Elasticsearch?
Thank you so much.


Maybe this?
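For instance, a conditional in the filter section can gate which events reach Elasticsearch; events removed by drop are neither inserted nor updated. This is only a sketch, and the empty countyName test is just an example condition:

filter {
	# sketch: skip rows that should not be written to Elasticsearch
	if [countyName] == "" {
		drop { }
	}
}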

Hello @ylasri, I defined it using a mutate filter in the filter section. Now it is working fine. Thank you so much for your help.
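For reference, such a mutate filter might look like the sketch below; the added field name and value are placeholders, not part of the original config:

filter {
	mutate {
		# placeholder: add or adjust fields before the event is indexed
		add_field => { "source" => "sqlserver" }
	}
}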
