Logstash pipeline execution order


I am working on a sync mechanism between a DB2 database and Elasticsearch, transporting the data with the JDBC input plugin.
However, I am running into an issue with pipeline ordering:

  • I have 3 master data tables that I need to transport first
  • after that, I want to transport my "main" table data and use the Elasticsearch "enrich" filter to look up data from the previous 3

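For context, a setup like the one described would usually be declared in `pipelines.yml` roughly like this (pipeline IDs and paths here are made up for illustration). Logstash starts every declared pipeline concurrently; there is no built-in option to say "run this one first":

```
# pipelines.yml — all pipelines start at the same time
- pipeline.id: masterdata-1
  path.config: "/etc/logstash/conf.d/masterdata1.conf"
- pipeline.id: masterdata-2
  path.config: "/etc/logstash/conf.d/masterdata2.conf"
- pipeline.id: masterdata-3
  path.config: "/etc/logstash/conf.d/masterdata3.conf"
- pipeline.id: main
  path.config: "/etc/logstash/conf.d/main.conf"
```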
No matter how I declare my pipelines, they always run at the same time, which I would like to avoid somehow...
Is there a way to do this?

Not as far as I know.

I ended up using tags to differentiate the data, and I use the schedule option to import the tables one after the other at specific times, so everything ends up in place.
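For reference, that workaround can be sketched roughly like this (the connection details, table names, and cron times are made up). Each `jdbc` input gets its own cron `schedule` so the master data imports run before the main import, and `tags` mark which table an event came from so the output can route it:

```
input {
  jdbc {
    # master data, runs at minute 0 of every hour (example schedule)
    jdbc_connection_string => "jdbc:db2://db-host:50000/MYDB"
    jdbc_user => "user"
    jdbc_driver_library => "/path/to/db2jcc4.jar"
    jdbc_driver_class => "com.ibm.db2.jcc.DB2Driver"
    statement => "SELECT * FROM MASTERDATA1"
    schedule => "0 * * * *"
    tags => ["masterdata1"]
  }
  jdbc {
    # main table, runs 30 minutes later so the master data is already indexed
    jdbc_connection_string => "jdbc:db2://db-host:50000/MYDB"
    jdbc_user => "user"
    jdbc_driver_library => "/path/to/db2jcc4.jar"
    jdbc_driver_class => "com.ibm.db2.jcc.DB2Driver"
    statement => "SELECT * FROM MAINTABLE"
    schedule => "30 * * * *"
    tags => ["main"]
  }
}

output {
  if "masterdata1" in [tags] {
    elasticsearch { index => "masterdata1" }
  } else if "main" in [tags] {
    elasticsearch { index => "main" }
  }
}
```

The obvious trade-off is that the ordering is only by wall-clock time, so the gap between schedules has to be large enough for the master data imports to finish.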

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.