Pushing data to a Kafka server

I am trying to load data into Kafka, but it's not working.

I get the following messages when running my Logstash config:

[DEBUG][org.apache.kafka.clients.NetworkClient][main] [Producer clientId=producer-1] Sending metadata request MetadataRequestData(topics=, allowAutoTopicCreation=true, includeClusterAuthorizedOperations=false, includeTopicAuthorizedOperations=false) to node xxxxxxxxxx.xxxxxx.xxxxx:9092 (id: 11 rack: null)
[2019-12-04T14:25:39,308][DEBUG][org.apache.kafka.clients.NetworkClient][main] [Producer clientId=producer-1] Using older server API v5 to send METADATA {topics=,allow_auto_topic_creation=true} with correlation id 61 to node 11
[2019-12-04T14:25:39,314][DEBUG][org.apache.kafka.clients.Metadata][main] [Producer clientId=producer-1] Updated cluster metadata updateVersion 56 to MetadataCache{cluster=Cluster(id = xxxxxxxx, nodes = [xxxxx.xx.xx:9092 (id: 9 rack: null), xx.xx.kn:9092 (id: 11 rack: null), xxx.xxx.xxx:9092 (id: 10 rack: null), xxxx.xx.xx:9092 (id: 8 rack: null)], partitions = , controller = xx.xx.xxx:9092 (id: 9 rack: null))}
[2019-12-04T14:25:40,098][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-12-04T14:25:40,099][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-12-04T14:25:41,345][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2019-12-04T14:25:45,103][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-12-04T14:25:45,104][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-12-04T14:25:46,345][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.

Here is my config:


input {
  jdbc {
    jdbc_driver_library => "/home/XXXXX.XXXX/elk/logstash-7.5.0/logstash-core/lib/jars/ojdbc8.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//sf-ds.sd.fv.ff:1521/dd.ddddd.com"
    jdbc_user => "dddddd"
    jdbc_password => "HHHgggggg"
    # all named parameters belong in a single `parameters` hash,
    # not in repeated declarations of the option
    parameters => {
      "CUSTOMER_CODE1" => "XXXX"
      "CUSTOMER_CODE2" => "YYYY"
      "CUSTOMER_CODE3" => "YYYY"
    }
    statement_filepath => "/home/EEE.ccc/elk/RRR.sql"
    tracking_column => "O_H_ID"
    tracking_column_type => "numeric"
    # tracking_column is only honored when use_column_value is enabled
    use_column_value => true
    jdbc_paging_enabled => true
    jdbc_page_size => 50000
    schedule => "* * * * *"
    clean_run => true
    last_run_metadata_path => "/path/.logstash_jdbc_last_run"
  }
}
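To rule out the input side, the kafka output below can be temporarily swapped for a stdout output so that any events the jdbc input emits are printed to the console. A minimal debugging sketch:

```
output {
  # temporary: print every event instead of sending to Kafka
  stdout { codec => rubydebug }
}
```

If nothing is printed after the schedule fires, the problem is in the jdbc input or the SQL statement rather than in the Kafka connection.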

output {
  kafka {
    bootstrap_servers => "fdfg-exterva-test-1.sa.qq:9092"
    topic_id => "o_h"
  }
}
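One thing worth noting: the kafka output plugin defaults to the plain codec, which serializes each event as a single line of text. For structured rows coming from the jdbc input, a json codec is usually what is wanted downstream. A variant of the output above, assuming JSON is the desired wire format:

```
output {
  kafka {
    bootstrap_servers => "fdfg-exterva-test-1.sa.qq:9092"
    topic_id => "o_h"
    # serialize the whole event as a JSON document
    codec => json
  }
}
```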