Question 1: If we have multiple nodes, is there a load-balancer component in the ES stack that decides which node is best suited to take the request?
No. The request simply goes to the node you connected to; that node then acts as the coordinating node for the request.
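Since ES itself does not balance that first hop, clients typically spread requests themselves by rotating across a list of configured nodes. Below is a minimal, self-contained sketch of that idea; the node addresses and class name are made up for illustration (the official clients provide their own, more elaborate node selectors):

```python
from itertools import cycle

# Hypothetical node addresses -- purely illustrative.
NODES = ["http://node1:9200", "http://node2:9200", "http://node3:9200"]

class RoundRobinSelector:
    """Minimal client-side 'load balancer': pick the next node in turn."""
    def __init__(self, nodes):
        self._nodes = cycle(nodes)

    def next_node(self):
        return next(self._nodes)

selector = RoundRobinSelector(NODES)
picked = [selector.next_node() for _ in range(4)]
# After a full rotation through the three nodes, the first node repeats.
```

Whichever node the client picks becomes the coordinating node for that request.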
The coordinating node creates an empty priority queue and then forwards the request to every shard in the index.
I'd say not to every shard, but to one copy of each shard of the index.
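The "one copy of each shard" point can be sketched as follows. Assume a hypothetical index with three shards, each having two copies (primary plus replica) spread over made-up node names; the coordinating node must route the request to exactly one copy per shard:

```python
import random

# Hypothetical cluster layout: shard number -> nodes holding a copy.
shard_copies = {
    0: ["node-a", "node-b"],
    1: ["node-b", "node-c"],
    2: ["node-a", "node-c"],
}

def pick_routing_set(copies, rng=random.Random(42)):
    """Pick exactly one copy of each shard to receive the request,
    roughly what a coordinating node does when fanning out a search."""
    return {shard: rng.choice(nodes) for shard, nodes in copies.items()}

targets = pick_routing_set(shard_copies)
# One target node per shard; the other copy of each shard is left alone.
```

The real selection is smarter than a coin flip (it weighs replica load and response times), but the invariant is the same: one copy per shard, never both.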
Question 2: If we have multiple indices, does the coordinating node forward the request to all indices in parallel or sequentially?
Question 3: If multiple queries land on a specific shard, does it create an individual queue for each request? How many such parallel queries can a specific shard handle?
Yes to the first part; as for the exact limit, I don't know.
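To make the "queue per request" idea concrete: each search request gets its own bounded priority queue, into which the per-shard top hits are merged to produce the overall top-k. A minimal sketch using a min-heap, with made-up shard responses:

```python
import heapq

def top_k_merge(shard_hits, k):
    """Merge per-shard hits into one top-k result using a bounded
    min-heap -- one such queue exists per request, not per shard."""
    heap = []  # holds at most k (score, doc_id) pairs
    for hits in shard_hits:            # responses from individual shards
        for score, doc_id in hits:
            if len(heap) < k:
                heapq.heappush(heap, (score, doc_id))
            elif score > heap[0][0]:   # better than the current worst kept hit
                heapq.heapreplace(heap, (score, doc_id))
    return sorted(heap, reverse=True)  # best score first

# Made-up shard responses for illustration: (score, doc_id) pairs.
shard0 = [(1.2, "d1"), (0.4, "d2")]
shard1 = [(2.5, "d3"), (0.9, "d4")]
result = top_k_merge([shard0, shard1], k=2)
# → [(2.5, 'd3'), (1.2, 'd1')]
```

The heap stays bounded at k entries, so memory per request depends on the requested page size, not on the total number of matching documents.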
Question 4: When multiple requests come into ES, does it maintain a queue for these incoming requests?
Question 5: How many parallel threads can ES spawn, per node and cluster-wide, to serve these incoming requests? Does ES have any limit?
I don't know the exact numbers, but yes, there are limits. Have a look at https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-threadpool.html
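The thread-pool model described in that page can be sketched in a few lines: each node runs fixed-size pools (search, write, etc.), each backed by a bounded queue, and once the queue is full new work is rejected rather than queued indefinitely (the caller sees a rejection, e.g. an `EsRejectedExecutionException` / HTTP 429). This is a toy model with tiny made-up numbers, not the real implementation; the actual pool sizes are derived from the node's CPU count as per the docs:

```python
import queue
import threading

class BoundedPool:
    """Toy model of an ES thread pool: a fixed worker count plus a
    bounded queue; when the queue is full, new work is rejected."""
    def __init__(self, workers, queue_size):
        self._q = queue.Queue(maxsize=queue_size)
        self._workers = [threading.Thread(target=self._run, daemon=True)
                         for _ in range(workers)]

    def start(self):
        for w in self._workers:
            w.start()

    def submit(self, task):
        try:
            self._q.put_nowait(task)
            return True          # accepted
        except queue.Full:
            return False         # rejected: caller must back off / retry

    def _run(self):
        while True:
            self._q.get()()      # run the task
            self._q.task_done()

# Two workers, queue of three -- deliberately tiny so the overflow is
# visible; workers are not started, so the queue fills immediately.
pool = BoundedPool(workers=2, queue_size=3)
accepted = [pool.submit(lambda: None) for _ in range(5)]
rejected = accepted.count(False)   # the overflow submissions are rejected
```

So both Question 4 and Question 5 come down to the same mechanism: yes, there are queues, but they are per-pool and bounded, and the thread counts are capped per node.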