I am trying to import data from an Oracle DB into Elasticsearch. The pipeline runs with Logstash on Windows, but when I run the same setup on a Unix box it fetches the data from the table and then fails with:
[Error] [logstash.javapipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <Logstash::Inputs::Jdbc jdbc_user=>""...
Error: No such file or directory - /.logstash_jdbc_last_run
Exception: Errno::ENOENT
Stack: org/jruby/RubyIO.java:1236:in `sysopen'
org/jruby/RubyIO.java:3796:in `write'
Conf file:
input {
  jdbc {
    jdbc_connection_string => 'jdbc:oracle:thin:@<host>:<port>/<service name>'
    jdbc_user => '<user>'
    jdbc_password => '<DBpassword>'
    jdbc_driver_library => '<absolute path to ojdbc8.jar>'
    jdbc_driver_class => 'Java::oracle.jdbc.OracleDriver'
    statement => 'select * from mytable'
  }
}
output {
  stdout {
    codec => json_lines
  }
}
What I do not understand is why, with the same downloaded logstash-7.2.0, the same conf file runs fine on Windows but loops forever (the plugin keeps restarting) on Unix RHEL 7 server boxes.
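For context: the jdbc input plugin persists the last-run timestamp to a metadata file, by default `$HOME/.logstash_jdbc_last_run`, so the `/.logstash_jdbc_last_run` path in the error suggests `$HOME` is empty or unwritable for the user running Logstash on the Unix box. A hedged workaround sketch (the `/tmp` path here is an assumption; any directory writable by the Logstash user should do):

```
input {
  jdbc {
    jdbc_connection_string => 'jdbc:oracle:thin:@<host>:<port>/<service name>'
    jdbc_user => '<user>'
    jdbc_password => '<DBpassword>'
    jdbc_driver_library => '<absolute path to ojdbc8.jar>'
    jdbc_driver_class => 'Java::oracle.jdbc.OracleDriver'
    statement => 'select * from mytable'
    # Override the default $HOME/.logstash_jdbc_last_run location with an
    # explicitly writable path (assumed location, adjust as needed):
    last_run_metadata_path => '/tmp/.logstash_jdbc_last_run'
  }
}
output {
  stdout {
    codec => json_lines
  }
}
```

Alternatively, ensuring `HOME` is set to a writable directory in the environment that launches Logstash should have the same effect.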