Logstash JDBC unable to run multiple statements and ingest data from different tables

There are a couple of issues here.

First, splitting your config into multiple files will not make Logstash run them as separate pipelines unless you use pipelines.yml, which you seem not to be using since you aren't bind mounting a pipelines.yml file; check this question for an example.

You would need to have a file like this:

- pipeline.id: pipeline-1
  path.config: "/usr/share/logstash/pipeline/logstash1.conf"
- pipeline.id: pipeline-2
  path.config: "/usr/share/logstash/pipeline/logstash2.conf"

And bind mount this file as /usr/share/logstash/config/pipelines.yml

If you do not do that, Logstash will merge both files and run them as a single pipeline named main, which is the same as having just one configuration file.

I suggest that you create this file and mount it as pipelines.yml to make logstash run your configuration as two different pipelines.
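As a sketch, assuming the file above is saved on the host as ./config/pipelines.yml and your pipeline files sit in ./pipeline (the host paths and image tag are illustrative, adjust them to your project layout), the bind mounts in your docker-compose could look like:

```yaml
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:8.13.0  # illustrative version
    volumes:
      # the pipelines.yml that defines the two pipelines
      - ./config/pipelines.yml:/usr/share/logstash/config/pipelines.yml:ro
      # the two pipeline configuration files referenced by pipelines.yml
      - ./pipeline/logstash1.conf:/usr/share/logstash/pipeline/logstash1.conf:ro
      - ./pipeline/logstash2.conf:/usr/share/logstash/pipeline/logstash2.conf:ro
```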

The id field on a filter is optional and mostly used to troubleshoot performance issues; you may remove it if you want.

You should also remove this from your logstash.yml; it is used when you have Centralized Pipeline Management configured and manage all your pipelines through Kibana. That is a paid feature that only works if you have at least a Platinum license.
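The settings involved are the xpack.management.* ones; as an illustration (the values here are made up, only the setting names come from the Logstash documentation), these are the kind of lines to delete from logstash.yml:

```yaml
# Centralized Pipeline Management settings -- remove these unless you
# have at least a Platinum license and manage pipelines from Kibana
xpack.management.enabled: true
xpack.management.elasticsearch.hosts: ["http://elasticsearch:9200"]
xpack.management.pipeline.id: ["pipeline-1", "pipeline-2"]
```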

As mentioned before, you are not running two independent pipelines; you are running a single pipeline that merges your two configurations, so this behavior is expected since you do not have any conditionals in your output.
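If you choose to keep a single merged pipeline instead of two, you would need conditionals to route each event to the right output, for example by tagging each jdbc input and testing the tag in the output. A sketch, with hypothetical tag and index names:

```
input {
  jdbc {
    # ... connection settings for the first table ...
    tags => ["table1"]
  }
  jdbc {
    # ... connection settings for the second table ...
    tags => ["table2"]
  }
}
output {
  if "table1" in [tags] {
    elasticsearch {
      # hypothetical index name
      index => "index-table1"
    }
  } else if "table2" in [tags] {
    elasticsearch {
      index => "index-table2"
    }
  }
}
```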

I see no errors in your jdbc configuration, so the only reasons I can think of for it not to work are:

  • There is no data being returned for your query.
There is data being returned, but for some reason it conflicts with the data returned by the first jdbc query, which would make Elasticsearch reject the document; however, that would generate a log entry, and you didn't share anything about it.

Can you share a sample of the data returned by both of those queries?

The fact that the second jdbc input is not creating the last run metadata file suggests that it is not returning any data.
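For reference, a jdbc input that tracks its position between runs typically looks like this sketch (the connection string, statement, column names, and paths are all illustrative):

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://db:5432/mydb"  # illustrative
    jdbc_user => "user"
    # :sql_last_value is replaced with the value stored from the previous run
    statement => "SELECT * FROM my_table WHERE updated_at > :sql_last_value"
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
    # file where the plugin persists the last tracked value between runs
    last_run_metadata_path => "/usr/share/logstash/data/last_run_table2"
    schedule => "*/5 * * * *"
  }
}
```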

Also, can you share your Logstash logs from when you start your docker-compose, so we can see whether any WARN/ERROR lines are being generated?