Elasticsearch Transport client performance vs. Sense plugin

I am trying to understand why the response time of my transport client execution, in milliseconds, is almost a full order of magnitude slower than the response I get when executing the same query in the Sense plugin.

Let's say I run a very basic query in Sense like:

GET /my_index/my_type/_search
{
  "query": {
    "bool": {}
  }
}

On a set of 30 million records I am getting response times like 250-500 ms.

When I run the same thing with a Java block like this:

SearchResponse response = client.prepareSearch("my_index")
        .setQuery(queryBuilder).execute().actionGet();

System.out.println(response.getTookInMillis());

I get response times ranging from 2000 to 5000 ms.

Here client is a transport client with default settings, and queryBuilder is an empty BoolQueryBuilder (thus the same query as above). I have only added a timeout and a custom cluster name to the client settings.
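One thing worth separating out is how much of the gap is client-side overhead (network, DNS, serialization) rather than search time, since getTookInMillis() reports only the time the server spent executing the search. A minimal sketch of a wall-clock timing helper, with a stand-in Runnable where the real search call would go (the actual call, e.g. client.prepareSearch(...).execute().actionGet(), is assumed from the thread and not runnable here without a cluster):

```java
public class RoundTripTimer {
    // Measures total client-observed wall-clock time of a call, in milliseconds.
    // Compare this against the server-reported "took" value: the difference is
    // client-side overhead (connection setup, DNS, serialization, network).
    static long wallClockMillis(Runnable call) {
        long start = System.nanoTime();
        call.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        // Stand-in for the search call: sleep ~50 ms to simulate a round trip.
        long elapsed = wallClockMillis(() -> {
            try { Thread.sleep(50); } catch (InterruptedException e) { }
        });
        System.out.println("round trip ms: " + elapsed);
    }
}
```

If the wall-clock time is far larger than "took", the problem is outside the search itself.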

My cluster is just 2 master nodes.

Any ideas where this large performance gap could be occurring?

Why do you use slow DFS_QUERY_THEN_FETCH?

No reason. I had tried switching to just QUERY_THEN_FETCH and did not see any noticeable improvement.

Do not set the search type; it confuses your measurements.

HTTP query

POST /index/_search
{
  "query": {
    "bool": {}
  }
}

has exactly the same search 'took' time as

SearchResponse response = client.prepareSearch()
        .setQuery(QueryBuilders.boolQuery()).execute().actionGet();

Check your network settings (host names, DNS resolution, routing) if they produce lags in your Java client. Java is very picky about that.
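To see whether hostname resolution is part of the lag, one can time a lookup directly from the JVM. A small stdlib-only sketch ("localhost" is a placeholder; substitute the host name from your client configuration):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class DnsCheck {
    // Returns how long resolving the given host name takes, in milliseconds,
    // or -1 if the name does not resolve. The first call pays any DNS cost;
    // later calls may be served from the resolver cache.
    static long resolveMillis(String host) {
        long start = System.nanoTime();
        try {
            InetAddress addr = InetAddress.getByName(host);
            System.out.println(host + " -> " + addr.getHostAddress());
        } catch (UnknownHostException e) {
            return -1;
        }
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        // "localhost" stands in for the cluster host name used by the client.
        System.out.println("resolution took " + resolveMillis("localhost") + " ms");
    }
}
```

A resolution time in the hundreds of milliseconds (or a reverse-lookup stall) would show up on every new connection the client opens.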

I will check out my network settings. I am using multiple types in my index and specifying that type in my Java search, so I would expect my measurements to be consistent.

I am facing the same issue. When I use curl to get a response, it takes around 40-50 ms, but when the same query is executed using execute().actionGet() on a TransportClient, it takes around 1000 ms. I am trying to figure out the root cause and a solution to this problem. Could anybody please help me out with this?

Please don't send the same question multiple times. It does not help.