Facing issues with Logstash while practicing incremental fetch over SQL datasets using the JDBC input plugin

I have already loaded SQL data sets into Logstash successfully, including timestamp fields. Now I am looking at dynamic loading (incremental fetch). I have a table 'Kibana_ch' which has three fields. By incremental fetch I mean: whenever I modify the table (insert/update), the new data should be fetched automatically into Logstash and become available in Elasticsearch so that I can visualize it in Kibana. I have followed the JDBC input plugin documentation and created a config file for my needs, and it produces no warnings at all. But the problem is that data is not getting loaded into Logstash. Logstash runs continuously without terminating. **First issue: since Logstash runs without terminating, I was not able to create any index.**
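For context, this is the incremental-fetch pattern as I understand it from the jdbc input documentation: on every scheduled run the plugin substitutes `:sql_last_value` in the statement with the last tracked value and persists the new value afterwards. The connection details below are placeholders, not my real ones; my actual config file is attached further down.

```
input {
  jdbc {
    # placeholder connection settings (not my real host/service/credentials)
    jdbc_driver_library => "/path/to/ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//dbhost:1521/SERVICE"
    jdbc_user => "user"
    jdbc_password => "password"

    # run every minute; :sql_last_value is replaced with the value persisted
    # from the previous run (initially 1970-01-01 when tracking a timestamp)
    schedule => "* * * * *"
    statement => "select * from kibana_ch where today_date > :sql_last_value"

    # track a column value instead of the time the query last ran
    use_column_value => true
    tracking_column => "today_date"
    tracking_column_type => "timestamp"
  }
}
```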
This is how it keeps on polling continuously:
```
D:\elk\logstash-5.2.1\bin>logstash -f '..\config\logstash-db_try.conf'
Using JAVA_HOME=C:\Program Files\Java\jre1.8.0_65 retrieved from C:\ProgramData\Oracle\java\javapath\java.exe
[2017-03-06T17:55:52,753][INFO ][logstash.pipeline       ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-03-06T17:55:52,805][INFO ][logstash.pipeline       ] Pipeline main started
[2017-03-06T17:55:52,921][INFO ][logstash.agent          ] Successfully started Logstash API endpoint {:port=>9600}
[2017-03-06T17:56:00,363][INFO ][logstash.inputs.jdbc    ] (0.021000s) SELECT * FROM (SELECT count(*) "COUNT" FROM (select today_Date from kibana_ch where today_Date > TIMESTAMP '2017-03-06 09:57:24.791000 +00:00') "T1") "T1" WHERE (ROWNUM <= 1)
[2017-03-06T17:57:00,152][INFO ][logstash.inputs.jdbc    ] (0.001000s) SELECT * FROM (SELECT count(*) "COUNT" FROM (select today_Date from kibana_ch where today_Date > TIMESTAMP '2017-03-06 09:57:24.791000 +00:00') "T1") "T1" WHERE (ROWNUM <= 1)
[2017-03-06T18:17:19,875][INFO ][logstash.inputs.jdbc    ] (0.001000s) SELECT * FROM (SELECT count(*) "COUNT" FROM (select today_Date from kibana_ch where today_Date > TIMESTAMP '2017-03-06 09:57:24.791000 +00:00') "T1") "T1" WHERE (ROWNUM <= 1)
[2017-03-06T18:17:19,882][INFO ][logstash.inputs.jdbc    ] (0.001000s) SELECT * FROM (SELECT count(*) "COUNT" FROM (select today_Date from kibana_ch where today_Date > TIMESTAMP '2017-03-06 09:57:24.791000 +00:00') "T1") "T1" WHERE (ROWNUM <= 1)
[2017-03-06T18:17:19,891][INFO ][logstash.inputs.jdbc    ] (0.001000s) SELECT * FROM (SELECT count(*) "COUNT" FROM (select today_Date from kibana_ch where today_Date > TIMESTAMP '2017-03-06 09:57:24.791000 +00:00') "T1") "T1" WHERE (ROWNUM <= 1)
[2017-03-06T18:17:19,897][INFO ][logstash.inputs.jdbc    ] (0.001000s) SELECT * FROM (SELECT count(*) "COUNT" FROM (select today_Date from kibana_ch where today_Date > TIMESTAMP '2017-03-06 09:57:24.791000 +00:00') "T1") "T1" WHERE (ROWNUM <= 1)
[2017-03-06T18:17:19,904][INFO ][logstash.inputs.jdbc    ] (0.001000s) SELECT * FROM (SELECT count(*) "COUNT" FROM (select today_Date from kibana_ch where today_Date > TIMESTAMP '2017-03-06 09:57:24.791000 +00:00') "T1") "T1" WHERE (ROWNUM <= 1)
[2017-03-06T18:17:19,918][INFO ][logstash.inputs.jdbc    ] (0.001000s) SELECT * FROM (SELECT count(*) "COUNT" FROM (select today_Date from kibana_ch where today_Date > TIMESTAMP '2017-03-06 09:57:24.791000 +00:00') "T1") "T1" WHERE (ROWNUM <= 1)
[2017-03-06T18:17:19,925][INFO ][logstash.inputs.jdbc    ] (0.001000s) SELECT * FROM (SELECT count(*) "COUNT" FROM (select today_Date from kibana_ch where today_Date > TIMESTAMP '2017-03-06 09:57:24.791000 +00:00') "T1") "T1" WHERE (ROWNUM <= 1)
```
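From the logged count query it looks like `:sql_last_value` has already been replaced with a timestamp persisted from an earlier run (`TIMESTAMP '2017-03-06 09:57:24.791000 +00:00'`), so if none of my rows are newer than that, the count stays at 0 and no fetch query is ever issued. As far as I understand from the docs, that persisted state can be reset either by deleting the last_run_metadata_path file or by setting clean_run => true; the snippet below is only a sketch of the latter, not something I have verified.

```
input {
  jdbc {
    # ... same driver, connection, schedule and statement settings as in my config below ...

    # assumption: clean_run => true makes the plugin ignore the value stored in
    # last_run_metadata_path and start over from the initial :sql_last_value,
    # so rows older than 2017-03-06 09:57:24 would be picked up again
    clean_run => true
    last_run_metadata_path => "/Users/thangarajm/.logstash_jdbc_last_run"
  }
}
```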
For reference, I have also attached the Logstash config file:
```
input {
  jdbc {
    jdbc_driver_library => "D:\elk\ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//tHANGA:1526/YUIO"
    jdbc_user => "thanga"
    jdbc_password => "welcomethanga#"
    jdbc_validate_connection => "true"
    schedule => "* * * * *"
    statement => "select * from kibana_ch where today_Date > :sql_last_value"
    use_column_value => true
    tracking_column => "today_Date"
    last_run_metadata_path => "/Users/thangarajm/.logstash_jdbc_last_run"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "1"
  }
}
output {
  elasticsearch {
    document_id => "%{today_date}"
    index => "fetch12"
    hosts => "127.0.0.1:9200"
  }
  stdout {
    codec => json
  }
}
```
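One thing I am unsure about in this config is the tracking column. As far as I can tell from the docs, the jdbc input lowercases column names by default (lowercase_column_names defaults to true), and there is a tracking_column_type setting for timestamp columns, so I suspect the tracking settings should look more like the fragment below (only the lines that would change relative to the config above; this is a guess based on the docs, not something I have confirmed):

```
    # assumption: the plugin lowercases column names, so the tracked column must
    # be referenced in lowercase, and its type must be declared as a timestamp
    use_column_value => true
    tracking_column => "today_date"
    tracking_column_type => "timestamp"
```

If that is right, the document_id => "%{today_date}" reference in the output section would already be consistent with the lowercased field name.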
The above config file doesn't produce any warnings, but still:

- I was not able to create my index.
- Data is not getting loaded into Logstash (see the debugging sketch after this list).
- I am not able to view anything in Elasticsearch, since the index was not created.
- Kibana is visualizing some random data that sits in between the data I am adding.
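To narrow down whether the jdbc input emits any events at all, I was planning to temporarily run the same input with only a stdout output using the rubydebug codec (a minimal debugging sketch, nothing Elasticsearch-specific):

```
# temporary debugging output: if rows are printed here but the index is still
# missing, the problem is on the Elasticsearch output side; if nothing is
# printed, the statement returns no rows for the current :sql_last_value
output {
  stdout {
    codec => rubydebug
  }
}
```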

I am stuck on these issues.
Suggestions and hands-on guidance would be much appreciated.
Thank you in advance.
Here is the structure of the table:
