Logstash goes down || Solution

Hello Team,

I am trying to push data from Oracle to ES, and I am able to do that. My problem is: while I am pushing the data, if ES goes down, I cannot figure out from where I need to resume pushing, because my last_run_metadata file has already been updated by then.
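For context on why the checkpoint is the issue: the logstash-input-jdbc plugin persists :sql_last_value to last_run_metadata_path as a small YAML document (for a numeric tracking column, something like "--- 1500"). A minimal sketch of a manual recovery, assuming that file format and an illustrative reset helper (not part of the plugin):

```python
import os
import tempfile

def reset_checkpoint(path: str, last_indexed_id: int) -> None:
    # Overwrite the metadata file so the next scheduled run re-fetches
    # all rows with tracking column > last_indexed_id.
    # "--- <value>" mirrors how the plugin serializes :sql_last_value as YAML.
    with open(path, "w") as f:
        f.write(f"--- {last_indexed_id}\n")

# Demo against a temporary file; the real path would be the one from
# the config (e.g. C:/ELK/logstash_jdbc_last_run_t_data.txt).
demo = os.path.join(tempfile.mkdtemp(), "logstash_jdbc_last_run_t_data.txt")
reset_checkpoint(demo, 1500)
with open(demo) as f:
    print(f.read().strip())  # --- 1500
```

If ES went down mid-push, resetting the value to the last id that is confirmed in the index makes Logstash re-read the missed rows on its next run.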

Thanks

I resolved the above issue. Please find the solution script below:
input {
  jdbc {
    jdbc_driver_library => "C:/ELK/logstash-7.3.1/logstash-core/lib/jars/postgresql-42.2.8.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://:/?ssl=false"
    jdbc_user => ""
    jdbc_password => ""
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    schedule => "* * * * *"
    statement => "select * from aiml_test_rk where defect_id > :sql_last_value"
    use_column_value => true
    tracking_column => "defect_id"
    record_last_run => true
    last_run_metadata_path => "C:/ELK/logstash_jdbc_last_run_t_data.txt"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    action => "index"
    hosts => ["127.0.0.1:9200"]
    index => "recovery_post"
  }
}
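The pattern this pipeline relies on can be sketched in a few lines: fetch only rows whose tracking column is greater than the saved checkpoint, and advance the checkpoint only after the sink has accepted each row, so an outage never skips data. The function names and the in-memory sink below are illustrative assumptions, not part of the Logstash plugin:

```python
def incremental_push(rows, checkpoint, sink):
    """rows: dicts with a 'defect_id' tracking column, sorted ascending.
    sink(row) returns False when the destination (e.g. ES) is down."""
    pending = [r for r in rows if r["defect_id"] > checkpoint]
    for row in pending:
        if not sink(row):
            return checkpoint          # unchanged -> rows are retried next run
        checkpoint = row["defect_id"]  # advance only after a confirmed write
    return checkpoint

# Simulate ES going down after defect_id 3.
rows = [{"defect_id": i} for i in range(1, 6)]
indexed = []
flaky = lambda r: (indexed.append(r) or True) if r["defect_id"] <= 3 else False
cp = incremental_push(rows, 0, flaky)
print(cp, len(indexed))  # 3 3
```

On the next scheduled run the same call with checkpoint 3 retries rows 4 and 5, which is exactly what the `defect_id > :sql_last_value` predicate in the statement above achieves once the checkpoint is correct.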
