Dear Team,
Kindly find my Logstash conf file below:
input {
  jdbc {
    jdbc_validate_connection => true
    jdbc_connection_string => "jdbc:oracle:thin:@10.1.50.79:1521/ndaie2"
    jdbc_user => "AXIA_SPRINT_DEV"
    jdbc_password => "AXIA_SPRINT_DEV"
    jdbc_fetch_size => 2000
    #jdbc_paging_enabled => true
    #jdbc_page_size => 20000
    jdbc_driver_library => "D:\Apeksha\logstash-5.4.0\Oracle_JDBC_Driver\ojdbc7.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    statement => "select id, name, city, to_char(dater, 'DD-MON-YYYY HH12:MI:SS') as dater_time
                  from logstash_try
                  WHERE to_char(dater, 'DD-MON-YYYY HH12:MI:SS') > to_char(CURRENT_DATE - interval '7' day, 'DD-MON-YYYY HH12:MI:SS')
                  AND to_char(dater, 'DD-MON-YYYY HH12:MI:SS') > TO_CHAR(TO_DATE(:sql_last_value, 'DD-MON-YYYY HH12:MI:SS'), 'DD-MON-YYYY HH12:MI:SS')
                  ORDER BY dater_time"
    use_column_value => true
    tracking_column => "dater_time"
    tracking_column_type => "timestamp"
    #clean_run => true
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    jdbc_default_timezone => "Asia/Kolkata"
    last_run_metadata_path => "C:\Users\apeksha.bhandari\.logstash_try_001"
    schedule => "*/5 * * * * *"
  }
}
output {
  stdout { codec => json }
  elasticsearch {
    hosts => ["10.1.54.76:9200"]
    index => "india_30"
    document_id => "%{id}"
    retry_on_conflict => 3
  }
  file {
    codec => json_lines
    path => "D:\Apeksha\logstash-5.4.0\india_30.log"
  }
}
The issue is that after the query runs once, Logstash seems to store these entries in some kind of cache, and on the next scheduled run it does not pick up the newly updated rows. Is there any way to clear this cache?
The same queries, when run directly in Oracle, return the correct output.
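
For reference, my understanding is that the jdbc input does not keep the rows themselves in memory; it only persists the tracked :sql_last_value to the file named in last_run_metadata_path (here C:\Users\apeksha.bhandari\.logstash_try_001) and re-reads that value before each scheduled run. Below is a minimal, hypothetical sketch of what I believe is involved; the timestamp shown is purely illustrative, not taken from my actual file:

# Hypothetical contents of the last_run_metadata_path file
# (the plugin serializes the last sql_last_value it saw, e.g. as YAML):
--- 2017-06-01 10:15:30.000000000 +05:30

# To start over, delete that file before restarting Logstash, or
# temporarily enable clean_run in the jdbc input so the stored value is ignored:
input {
  jdbc {
    # ... same settings as above ...
    clean_run => true   # reset sql_last_value instead of reusing the persisted one
  }
}

If that understanding is correct, this single tracked value is the only state deciding which rows the next scheduled query returns.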
Regards,
Apeksha