I'm using the JDBC input configuration below in Logstash to update an already existing index in Elasticsearch, without duplicating rows or adding an updated row as a new one.
My issue is that the updated row does not get updated in place; it gets indexed as a new row instead.
Versions: Elasticsearch, Logstash and Kibana are all v7.1.0.
```
input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://DB01:1433;databasename=mydb;integratedSecurity=true"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_driver_library => "C:\Program Files\sqljdbc_6.2\enu\mssql-jdbc-6.2.2.jre8.jar"
    jdbc_user => nil
    # only fetch rows changed since the last run, tracked via updated_on
    statement => "SELECT * from data WHERE updated_on > :sql_last_value ORDER BY updated_on"
    use_column_value => true
    tracking_column => "updated_on"
    tracking_column_type => "timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "datau"
    # upsert keyed on the SQL id column so existing documents get updated in place
    action => "update"
    document_id => "%{id}"
    doc_as_upsert => true
  }
  stdout { codec => rubydebug }
}
```
Here's the result when I run the conf file in Logstash:

```
D:\logstash-7.1.0\logstash-7.1.0\bin>logstash -f update.conf
Sending Logstash logs to D:/logstash-7.1.0/logstash-7.1.0/logs which is now configured via log4j2.properties
[2019-08-23T11:17:32,538][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-08-23T11:17:32,572][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.0"}
[2019-08-23T11:17:45,784][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-08-23T11:17:46,121][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-08-23T11:17:46,218][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-08-23T11:17:46,224][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-08-23T11:17:46,290][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-08-23T11:17:46,323][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-08-23T11:17:46,389][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0xd4fd1 run>"}
[2019-08-23T11:17:46,521][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-08-23T11:17:47,479][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
[2019-08-23T11:17:48,505][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-08-23T11:17:50,356][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-08-23T11:17:51,433][INFO ][logstash.inputs.jdbc     ] (0.080779s) SELECT * from data WHERE updated_on > '2019-08-24T00:00:00.000' ORDER BY updated_on
D:/logstash-7.1.0/logstash-7.1.0/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
"id" => 34,
"updated_on" => 2019-08-24T10:55:00.000Z,
"@timestamp" => 2019-08-23T11:17:51.589Z,
"name" => "SK",
"@version" => "1"
}
[2019-08-23T11:17:55,105][INFO ][logstash.runner ] Logstash shut down.
```
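The stdout event does contain the `id` field (34), so with `document_id => "%{id}"` I'd expect exactly one document with `_id` 34 in `datau`. A quick way to check whether the rows are really being duplicated (index name and host taken from my config) is:

```
# search the target index for that id and count how many documents come back
curl -s -H "Content-Type: application/json" "http://localhost:9200/datau/_search?pretty" -d '{ "query": { "match": { "id": 34 } } }'
```

If that search returns more than one hit for the same id, the documents are clearly being written with different `_id` values rather than being upserted.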
Where am I going wrong?