Logstash reading CLOB data from Oracle

Hi,

I want to read a CLOB column containing nested JSON from a source Oracle table and write it to Elasticsearch using Logstash. My configuration file defines four JDBC inputs against the same table, split by a partition column. The table has around 15 million rows.

The read/write throughput is very slow, only about 600 events/sec. I also noticed that when the source column is not a CLOB and the data is not nested, reads and writes are very fast. How can I make Logstash read the CLOB column from Oracle and write to the ES index faster? Any suggestions are highly appreciated.

Thanks in advance.
