JDBC input is not working when added with Kafka input

Hello Team,

I am trying to configure multiple inputs in a single pipeline:

input {
  jdbc {
  }
  kafka {
  }
  kafka {
  }
}

but when I run the pipeline I only get Kafka messages and the JDBC input is not producing anything.
Could you please help me understand whether we can have different types of inputs in a single pipeline?

Thank you,
Aditya

You should consider setting up multiple pipelines.
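For example, pipelines.yml could look something like this (the pipeline IDs and paths below are just placeholders):

# config/pipelines.yml
- pipeline.id: jdbc-pipeline
  path.config: "/etc/logstash/conf.d/jdbc.conf"
- pipeline.id: kafka-pipeline
  path.config: "/etc/logstash/conf.d/kafka.conf"

Each entry points to its own config file with its own input, filter, and output sections, and the pipelines run independently of each other.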

Thank you @AquaX for your reply.
Yes, I am considering that. Just one thing: I want to reuse the same filter and output sections. I also looked at sending data from one pipeline to another pipeline, but that again only moves one input section into a separate file, so I was thinking of adding the jdbc section to the same input section.
Any thoughts on this approach?

I think the problem is that the kafka and jdbc input plugins work differently. The jdbc input runs its query on a schedule (and only once, at startup, if no schedule is set), while the kafka inputs consume continuously.
I'm sure you can get them to work together, but you may need to set a schedule for the jdbc plugin and then wait for it to execute.
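For example, something along these lines should make the jdbc input poll periodically (the connection details and query below are placeholders, not a known-good config):

input {
  jdbc {
    jdbc_driver_library => "/path/to/jdbc-driver.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "user"
    jdbc_password => "secret"
    statement => "SELECT * FROM my_table"
    schedule => "*/5 * * * *"   # cron-style: run the query every 5 minutes
  }
}

The schedule option uses cron-like syntax, so you can tune how often the query runs.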
Pipeline-to-pipeline communication could work as well, and it would let you keep the shared filter and output sections in one place.
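A rough sketch of that setup (the pipeline IDs, file paths, and virtual address below are placeholders):

# config/pipelines.yml
- pipeline.id: jdbc-intake
  path.config: "/etc/logstash/conf.d/jdbc-intake.conf"
- pipeline.id: kafka-intake
  path.config: "/etc/logstash/conf.d/kafka-intake.conf"
- pipeline.id: processing
  path.config: "/etc/logstash/conf.d/processing.conf"

# jdbc-intake.conf and kafka-intake.conf each keep their own input
# section and forward everything to the shared pipeline:
output {
  pipeline { send_to => ["processing"] }
}

# processing.conf holds the shared filter and output sections:
input {
  pipeline { address => "processing" }
}
filter {
  # shared filters go here
}
output {
  # shared outputs go here
}

That way each input type lives in its own pipeline, so the jdbc schedule and the kafka consumers don't interfere with each other, while the filter and output logic is defined only once.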
