Running Logstash: only 1 row inserted into Elasticsearch


I run Logstash from the command prompt:
logstash -f logstash.conf

input {
  jdbc {
    jdbc_driver_library => "C:\ELASTIC\sqljdbc_6.0\enu\jre8\sqljdbc42.jar"
    jdbc_driver_class => ""
    jdbc_connection_string => "jdbc:sqlserver://server;databaseName=database;"
    jdbc_user => "user"
    jdbc_password => "pwd"
    statement => "select top 200 * from table"
  }
}

filter {
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "oph2"
    document_id => "%{id2}"
  }
  stdout { codec => rubydebug }
}

Logstash runs fine: I can see the 200 rows displayed in the command prompt. But when I check Elasticsearch, only 1 document made it into the index, and the other 199 show up as deleted. From Kibana:

"stats": {
"primaries": {
"docs": {
"count": 1,
"deleted": 199

Does anyone have a solution for this? Why were the 199 rows deleted?
Thanks in advance.

Check what the ID of the document in Elasticsearch is. If your data does not contain an id2 field, document_id => "%{id2}" will result in every document getting the literal string %{id2} as its ID. The first row creates that document and each of the remaining 199 rows overwrites it; each overwrite of an existing document shows up as a deleted doc in the index stats, which matches the count of 1 and deleted of 199 you are seeing.
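As a sketch of the fix, assuming the table's primary-key column is actually named id (substitute whatever unique column your table really has), the output block would reference that existing field instead:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "oph2"
    document_id => "%{id}"
  }
}

With a field reference that resolves to a distinct value per row, each row gets its own document ID, so all 200 rows are indexed as separate documents, and re-running the pipeline updates them in place rather than creating duplicates.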

Hi Christian,

Many thanks for your advice. After I changed document_id to a field that exists in my data, all rows were successfully inserted into Elasticsearch. Thanks for saving my time.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.