Performance of Elasticsearch based on document count

Hi, I am using Elasticsearch 2.2.0. I want to know whether Elasticsearch's search performance will degrade if we index data into a single index with a document count of more than 3,000,000 events (the count may grow beyond that).

If performance does degrade, what is the best way to index the data? Search performance is our highest priority.

The queries applied to this data will have two- or three-level aggregations and filtering on multiple sourcetypes, along the lines of the sketch below.
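
For concreteness, here is a rough sketch of what such a query might look like through the Python elasticsearch client; the index name (`events`), the `sourcetype` values, and the field names are all hypothetical placeholders:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch(["localhost:9200"])

# Hypothetical example: filter on several sourcetypes, then aggregate
# three levels deep (terms bucket per sourcetype, a date_histogram
# inside each, and a metric aggregation inside each time bucket).
body = {
    "size": 0,
    "query": {
        "bool": {
            "filter": [
                {"terms": {"sourcetype": ["access_log", "error_log"]}}
            ]
        }
    },
    "aggs": {
        "by_sourcetype": {
            "terms": {"field": "sourcetype"},
            "aggs": {
                "per_hour": {
                    "date_histogram": {"field": "@timestamp", "interval": "hour"},
                    "aggs": {
                        "avg_latency": {"avg": {"field": "latency_ms"}}
                    }
                }
            }
        }
    },
}

response = es.search(index="events", body=body)
```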

I'm not sure what you're asking here. Will Elasticsearch be slower the more documents you have in the index? Yes, of course. Will it matter in your case? That depends on many factors.

Have you taken any measurements to find out? What configuration (hardware, software) do you use? What do you mean by "the performance" - throughput? peak performance? average search time?

Our hardware configuration is as follows:
8-core CPU, 64 GB RAM, and the ES_HEAP_SIZE variable set to 31 GB.

For performance, I am interested in average search time.
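
One way to put a number on average search time is to run a representative query repeatedly and average the server-side `took` value (in milliseconds) that Elasticsearch returns with every search response; a minimal sketch, again with a hypothetical index name and a placeholder query:

```python
import statistics
from elasticsearch import Elasticsearch

es = Elasticsearch(["localhost:9200"])

# Run a representative query N times and average the 'took' field
# (server-side search time in ms). This excludes client and network
# overhead; time the call itself if you want end-to-end latency.
N = 50
body = {"size": 0, "query": {"match_all": {}}}  # substitute a real query

took_ms = [es.search(index="events", body=body)["took"] for _ in range(N)]

print("average search time: %.1f ms" % statistics.mean(took_ms))
```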

Also, we are upgrading Elasticsearch from 2.2.0 to 2.3. Will this upgrade result in faster query execution?

Thanks