Send data to multiple destinations - store data in multiple Kafka topics?

The first option seems to be to send events to either the first Kafka topic or the second, with a conditional deciding which. Alternatively, you could make the output conditional so that everything goes to the first topic, and only the events that satisfy the condition also go to the second. The two pipelines would then each copy everything from their topic to their Elasticsearch instance.
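A minimal sketch of that second approach in a Logstash output section (the topic names, the `priority` field, and the value `"high"` are made up for illustration; the kafka output plugin's `bootstrap_servers` and `topic_id` options are the real settings):

```
output {
  # Everything goes to the first topic.
  kafka {
    bootstrap_servers => "localhost:9092"
    topic_id => "all-events"
  }
  # Events matching the condition additionally go to the second topic.
  # The field name and value here are hypothetical.
  if [priority] == "high" {
    kafka {
      bootstrap_servers => "localhost:9092"
      topic_id => "high-priority-events"
    }
  }
}
```

Each downstream pipeline then just reads its own topic and writes everything to its Elasticsearch instance, with no filtering logic of its own.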

I am not sure that helps, though, because to be honest I do not understand your concern about prioritization of partitions. Why do you need multiple partitions? Is the data volume so high that you have trouble keeping up?