JDBC input not writing to index

Hello,
Here is my sample input.conf:

input {
        jdbc {
                jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
                jdbc_connection_string => "jdbc:sqlserver://x.x.x.x;databaseName=DB;user=user;password=pass;"
                jdbc_user => "user"
                jdbc_password => "pass"
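                # rufus-scheduler cron syntax: run the statement once a day at 17:45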
                schedule => "45 17 * * *"
                statement => "some query"
        }
}
filter {
        date {
                match => [ "SendTime", "yyyy-MM-dd" ]
                target => "SendTime"
        }
}
output {
        stdout { codec => rubydebug }
        elasticsearch {
                hosts => "x.x.x.x:9200"
                index => "index"
        }
}

The config works just fine if I run it as /usr/share/logstash/bin/logstash -f input.conf, and it successfully writes into the index. The problem started when I added the path to pipelines.yml. I have 4 running pipelines, and this is one of them.

- pipeline.id: pipe1
  path.config: "/path1"
- pipeline.id: invoice_sends
  path.config: "/etc/logstash/input/input.conf"
- pipeline.id: pipe2
  path.config: "/path2"
- pipeline.id: pipe3
  path.config: "/path3"

When I check the log, no error is shown.

[2021-07-05T10:27:08,830][INFO ][logstash.javapipeline    ][invoice_sends] Pipeline Java execution initialization time {"seconds"=>1.94}
[2021-07-05T10:27:09,201][INFO ][logstash.javapipeline    ][invoice_sends] Pipeline started {"pipeline.id"=>"invoice_sends"}

At the bottom of the log:

[2021-07-05T10:40:18,567][INFO ][logstash.agent           ] Pipelines running {:count=>4, :running_pipelines=>[:pipe1, :invoice_sends, :pipe2, :pipe3], :non_running_pipelines=>[]}

This is my first time running the jdbc input, and I'm using a very minimal config. Am I missing something to start it as a service?
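For reference, my understanding is that starting Logstash with -f makes it ignore pipelines.yml, so to reproduce what the service does I assume it has to be pointed at the settings directory instead (the path below is the default on my install, adjust as needed):

# run in the foreground using /etc/logstash/pipelines.yml, like the service does,
# but with the output going to the console
/usr/share/logstash/bin/logstash --path.settings /etc/logstash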

Update:

It runs, but not at the scheduled time.
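In case anyone else sees this: my understanding is that schedule uses rufus-scheduler cron syntax, so the hour it fires depends on the timezone the scheduler is using. As far as I can tell from the rufus-scheduler docs, a timezone name can be appended to the cron expression; something like the sketch below (the zone is just an example, not my actual setting):

jdbc {
        # ... other settings as above ...
        # example only: pin the schedule to an explicit timezone
        schedule => "45 17 * * * Asia/Jakarta"
}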
