Thank you, here is my pipeline configuration:
input {
    jdbc {
        clean_run => true
        jdbc_driver_library => "E:\ELK 8.10.2\logstash-conf\ojdbc6.jar"
        jdbc_driver_class => "oracle.jdbc.driver.OracleDriver"
        jdbc_connection_string => "*****"
        jdbc_user => "****"
        jdbc_password => "****"
        jdbc_default_timezone => "Asia/Riyadh"
        #schedule => "*/5 * * * *"
        statement => "SELECT CHANNEL FROM CALL_HISTORY_DETAILS_VIEW FETCH FIRST ROW ONLY"
        #use_column_value => true
        #tracking_column => "CALLS"
        tags => ["oraclelogger"]
    }
}

filter {
    date {
        match => [ "CALL_STRAT_DATE_TIME", "yyyy/MM/dd HH:mm:ss" ]
        timezone => "Asia/Riyadh"
        target => "@timestamp"
    }
}

output {
    elasticsearch {
        hosts => ["http://****:9200/"]
        index => "testlogger_index"
        user => "elastic"
        password => "****"
        ssl => false
        ssl_certificate_verification => false
    }
}
I have also tried converting @timestamp itself into a new field (through a filter), but it didn't work. I also tried different formats (UNIX/ISO) and still got no results.
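Roughly, the multi-format attempt looked like this (a sketch only; the field name is copied from the filter above, and ISO8601/UNIX are the date filter's built-in format keywords):

filter {
    date {
        # field name taken from the filter above; formats are tried in order
        match => [ "CALL_STRAT_DATE_TIME", "yyyy/MM/dd HH:mm:ss", "ISO8601", "UNIX" ]
        timezone => "Asia/Riyadh"
        target => "@timestamp"
    }
}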
I also tried selecting the time column from the DB with the TO_CHAR function, converted to the format 'yyyy/MM/dd HH:mm:ss', and tried converting it to UTC:
CAST(
    FROM_TZ(Call_Start_Time, 'Asia/Riyadh') AT TIME ZONE 'UTC' AS TIMESTAMP
) AS Call_Start_Time_UTC
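And the TO_CHAR attempt was of roughly this shape (a sketch; the view and column names are taken from the statements above, and the alias is only illustrative, it just has to match whatever field the date filter looks for):

SELECT TO_CHAR(Call_Start_Time, 'YYYY/MM/DD HH24:MI:SS') AS Call_Start_Date_Time  -- alias must match the field used in the date filter
FROM CALL_HISTORY_DETAILS_VIEW
FETCH FIRST ROW ONLY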
This is getting really frustrating, to be honest.