Prioritising Logs Using Kafka & Logstash

We have a deployment in which different types of logs are pushed to Kafka. Logstash then pulls these logs from Kafka and pushes them to Elasticsearch (ES). We want to prioritise the way logs get indexed in ES. For example, logs of type A should be indexed faster than logs of type B. Basically, if there are too many logs of type B, they should not starve logs of type A.

Can someone guide me on whether the deployment below will achieve this?

Create two topics in Kafka, say topic_A and topic_B, where topic_A has 4 partitions and topic_B has 2 partitions. Then either have two Kafka inputs in Logstash, one reading from topic_A with 4 consumer threads and the other reading from topic_B with 2 consumer threads, or run two Logstash instances, one reading from topic_A with 4 consumer threads and the other reading from topic_B with 2 consumer threads.
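For illustration, here is roughly what the single-instance variant could look like as a pipeline config. The broker address, consumer group IDs, tag names and index names are just placeholders, not something already decided:

```
# logstash.conf -- single pipeline with two Kafka inputs (sketch; all names are placeholders)
input {
  kafka {
    bootstrap_servers => "kafka:9092"
    topics            => ["topic_A"]
    group_id          => "logstash_topic_a"
    consumer_threads  => 4              # one thread per partition of topic_A
    tags              => ["type_a"]
  }
  kafka {
    bootstrap_servers => "kafka:9092"
    topics            => ["topic_B"]
    group_id          => "logstash_topic_b"
    consumer_threads  => 2              # one thread per partition of topic_B
    tags              => ["type_b"]
  }
}

output {
  if "type_a" in [tags] {
    elasticsearch {
      hosts => ["http://elasticsearch:9200"]
      index => "logs-a-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["http://elasticsearch:9200"]
      index => "logs-b-%{+YYYY.MM.dd}"
    }
  }
}
```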

Having different topics and thread counts is probably the best way.

Thanks @warkolm.

Are multiple Kafka inputs supported in a single Logstash instance?

Yes, you can.
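Keep in mind that multiple kafka inputs in one pipeline still share that pipeline's queue and worker threads, so a flood on topic_B can still delay topic_A events at the filter/output stage. If you want harder isolation, you can give each topic its own pipeline (or its own instance) with a dedicated config, along these lines; the broker address, group ID and index name are placeholders:

```
# topic_a.conf -- dedicated pipeline/instance reading only topic_A (sketch)
input {
  kafka {
    bootstrap_servers => "kafka:9092"
    topics            => ["topic_A"]
    group_id          => "logstash_topic_a"
    consumer_threads  => 4    # one thread per partition of topic_A
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "logs-a-%{+YYYY.MM.dd}"
  }
}
```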

Thanks @warkolm again.

Check this reply for details: