Manage pipeline launch order

Hi, I have a project with 8 pipelines that handle different data; some depend on other pipelines having finished executing. Is there a way to schedule their execution so that some wait for others to finish before starting?
Thanks in advance

Hello,

Why don't you feed the output of the first pipeline into the input of the second?
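In Logstash that wiring can be done with the pipeline-to-pipeline feature (the `pipeline` input and output plugins). A minimal sketch, with made-up pipeline ids and config paths:

```conf
# pipelines.yml (made-up ids and paths)
- pipeline.id: upstream
  path.config: "/etc/logstash/upstream.conf"
- pipeline.id: downstream
  path.config: "/etc/logstash/downstream.conf"

# upstream.conf -- forward events to the downstream pipeline
output { pipeline { send_to => ["downstream"] } }

# downstream.conf -- receive events from upstream
input { pipeline { address => "downstream" } }
```

The `address` in the downstream input must match an entry in the upstream's `send_to` list. Note this streams events continuously; it chains pipelines but doesn't by itself wait for one to *finish* before the next starts.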

Can you maybe be more specific about your use case?

Thanks for your reply @grumo35

To give more details: I have 7 pipelines, each with a JDBC input that queries a different database and an Elasticsearch index as output. Some of these pipelines use Elasticsearch filters to query an index previously created by another pipeline (hence the need to wait for that pipeline to finish). Finally, I have a last transformation pipeline that takes one of the previously created indexes as input, enriches each event with data from the other indexes (via Elasticsearch filters), and calls an HTTP service as output.
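Roughly, one of these pipelines looks like this (heavily simplified; connection strings, index names, and the query are all made up):

```conf
input {
  jdbc {
    # each pipeline points at a different database
    jdbc_connection_string => "jdbc:postgresql://db-host:5432/sales"
    jdbc_user              => "etl"
    statement              => "SELECT * FROM orders"
  }
}
filter {
  elasticsearch {
    # enrich from an index that an earlier pipeline is supposed
    # to have already written -- this is where ordering matters
    index  => "customers"
    query  => "customer_id:%{customer_id}"
    fields => { "segment" => "customer_segment" }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "orders"
  }
}
```

If the `customers` index isn't fully populated yet, the filter lookups silently come back empty, which is why the enriching pipelines have to start only after their source pipelines are done.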

Oh, that sounds complicated.

I'm not used to working with the JDBC plugins, but I assume there is a way to tell the indexes apart, since they should carry database names. So would it be possible to detect the end of the data stream (input) by, say, checking whether more than 2 minutes have passed since the last new data arrived (no new data = pipeline finished?)
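That "no new data for 2 minutes" heuristic could be scripted outside Logstash: poll the target index's `_count` API and only start the next pipeline once the count has been stable for the whole quiet window. A rough Python sketch (the URL, index name, and timings are made up, and the stability threshold is a heuristic, not a guarantee):

```python
import json
import time
import urllib.request


def doc_count(es_url, index):
    """Return the document count of an index via Elasticsearch's _count API."""
    with urllib.request.urlopen(f"{es_url}/{index}/_count") as resp:
        return json.load(resp)["count"]


def wait_until_stable(get_count, quiet_seconds=120, poll_seconds=10,
                      now=time.monotonic, sleep=time.sleep):
    """Block until get_count() has not changed for quiet_seconds.

    No new documents for the whole quiet window is taken to mean the
    upstream pipeline has finished writing. `now` and `sleep` are
    injectable so the logic can be tested without real waiting.
    """
    last_count = get_count()
    last_change = now()
    while now() - last_change < quiet_seconds:
        sleep(poll_seconds)
        count = get_count()
        if count != last_count:
            last_count, last_change = count, now()
    return last_count
```

Against a live cluster you would pass something like `lambda: doc_count("http://localhost:9200", "orders")` as `get_count`, then launch the dependent pipeline once the call returns.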

The other way would be to use an external tool like n8n to trigger specific workflows and pipelines?

Good luck !!

I don't know n8n; I'm going to look into that option...

I was also thinking of using an SQS queue as output, because each source shares a common field, which would let me use it in my WHERE clause (if I can use a dynamic value in an input, but I don't think that's possible).
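For reference, Logstash does ship an SQS output plugin (`logstash-output-sqs`); a minimal sketch, with a made-up queue name and region:

```conf
output {
  sqs {
    queue  => "pipeline-handoff"   # hypothetical queue name
    region => "eu-west-1"
  }
}
```

A downstream consumer could then read the common field from the queued messages, though as far as I know the JDBC input's `statement` is fixed at config time, so the value couldn't be injected into a WHERE clause per event.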

Thank you very much