Update is not Working in Logstash JDBC Input

Hi,
I am working with Elasticsearch, loading data from a SQL Server database using a Logstash conf file with a SQL query. This is my conf file:

input {
  jdbc {
    jdbc_driver_library => "D:\Elastic\elasticsearch-6.0.0\lib\sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://192.168.0.168;databaseName=Sample;"
    jdbc_user => "sa"
    jdbc_password => "1234"
    statement => "select Top 100 id,FName,LName,Email from dbo.AddUser WHERE id > :sql_last_value"
    schedule => "* * * * *"
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    clean_run => true
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "users"
    document_id => "%{id}"
    document_type => "user"
    manage_template => true
  }
  stdout { codec => rubydebug }
}

My table column details:

  id               int
  FName            varchar(200)
  LName            varchar(80)
  Email            varchar(80)
  Password         varchar(50)
  CreatedDate      datetime
  LastUpdatedDate  datetime
  Status           int
  Resid            int
  Budget           int
  Dailsales        int
  Reports          int
  DayProfit        int
  View             varchar(20)
  Boss             int
  usersfrom        varchar(50)

When I run this with logstash -f JDBCConnector.conf --debug, it fetches all the records correctly.
When I insert a new record into the database, the new record is picked up correctly because of the schedule setting in the conf file.

But if I update any record in the database, the change is not reflected in my Elasticsearch output.

What mistake did I make here? I am new to Elasticsearch and Logstash.

Please help...

You're explicitly selecting only rows with an id that's higher than what it was the last time, so the observed behavior is expected. If you want to catch old rows that have been updated you have two choices:

  • Fetch all documents every time.
  • Use a "last modified" column in the database that you can use instead of the id to select only updated rows (see the sketch below).
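A minimal sketch of what the second option could look like, assuming the AddUser table's LastUpdatedDate column (shown in the schema above) is maintained on every change. Note that the jdbc input lowercases column names by default, so the tracking column is given in lowercase:

input {
  jdbc {
    jdbc_driver_library => "D:\Elastic\elasticsearch-6.0.0\lib\sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://192.168.0.168;databaseName=Sample;"
    jdbc_user => "sa"
    jdbc_password => "1234"
    # fetch only rows changed since the last scheduled run
    statement => "select id,FName,LName,Email,LastUpdatedDate from dbo.AddUser WHERE LastUpdatedDate > :sql_last_value"
    schedule => "* * * * *"
    use_column_value => true
    # lowercased, because lowercase_column_names defaults to true
    tracking_column => "lastupdateddate"
    tracking_column_type => "timestamp"
    # keep the stored :sql_last_value between runs
    clean_run => false
  }
}

With document_id => "%{id}" in the output, the re-fetched rows simply overwrite their existing documents in the users index.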

Hi Magnus,

Thanks for the reply. First I tried your first choice:

Fetch all documents every time.

So I modified the query like this:

select * from dbo.AddUser WHERE id > :sql_last_value

But the updated values are still not reflected in the ES output.

If I go with your second choice:

Use a "last modified" column in the database that you can use instead of the id to select only updated rows.

In my database, not all tables have a last-modified column; only some of them do.

So how can I get updated records without adding a last-modified column to every table?

But the updated values are still not reflected in the ES output.

??? You're still only fetching rows with ids higher than the last seen value, i.e. only new rows. Drop the WHERE clause.
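A sketch of the first option under that advice: fetch everything on every scheduled run and let document_id => "%{id}" in the output overwrite the existing documents, so updates show up as re-indexed documents.

input {
  jdbc {
    # jdbc_driver_library, jdbc_driver_class, jdbc_connection_string,
    # jdbc_user and jdbc_password as in the original config
    # no WHERE clause: every row is fetched on each scheduled run
    statement => "select id,FName,LName,Email from dbo.AddUser"
    schedule => "* * * * *"
  }
}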

So how can I get updated records without adding a last-modified column to every table?

Logstash can't magically figure out which rows have changed. You need a "last modified" column (or something equivalent; a monotonically increasing integer field would work too).
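As an illustration of the "monotonically increasing" alternative on SQL Server: a rowversion column is bumped automatically on every insert and update, with no application changes. The column name RowVer below is only an example:

-- RowVer is maintained by SQL Server itself: it increases on every INSERT and UPDATE of a row
ALTER TABLE dbo.AddUser ADD RowVer rowversion;

-- example statement for the jdbc input, tracking the value as a number
select id, FName, LName, Email, CONVERT(bigint, RowVer) AS rv
from dbo.AddUser
WHERE CONVERT(bigint, RowVer) > :sql_last_value

The jdbc input would then use use_column_value => true, tracking_column => "rv" and tracking_column_type => "numeric".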

Thanks, it's working.

I have a doubt. I am trying to use a system table in SQL Server: if there is any change to a table, I can fetch its last modified date and time.

For example:

SELECT *
FROM sys.dm_db_index_usage_stats
WHERE database_id = DB_ID('Fobesoft_05Dec2017')
AND OBJECT_ID = OBJECT_ID('dbo.AddUser')

This query returns output like:

last_user_update 2017-12-19T10:30:34.547Z

Based on this output I need to fetch all the records of that table, i.e.:

select * from dbo.AddUser

If the last_user_update time has changed, I need to run that query and fetch all the updated records.
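For what it's worth, a sketch of how that plan might be folded into a single jdbc statement, assuming the same DB_ID and OBJECT_ID values as above (note that sys.dm_db_index_usage_stats holds one row per index and is cleared on a SQL Server restart, so this is not verified as reliable):

input {
  jdbc {
    # driver and connection settings as in the original config
    statement => "SELECT u.*, s.lu
                  FROM dbo.AddUser u
                  CROSS JOIN (SELECT MAX(last_user_update) AS lu
                              FROM sys.dm_db_index_usage_stats
                              WHERE database_id = DB_ID('Fobesoft_05Dec2017')
                                AND object_id = OBJECT_ID('dbo.AddUser')) s
                  WHERE s.lu > :sql_last_value"
    schedule => "* * * * *"
    use_column_value => true
    tracking_column => "lu"
    tracking_column_type => "timestamp"
  }
}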

My questions are:

Is there any if condition I can use in the JDBC input plugin?

Is it possible to run another conf file from one conf file based on a condition?

I don't think that's possible.
