[Elasticsearch performance] Match query is very slow!

  1. Searching for the keyword "ladder" with matchQuery (Java API) is very, very slow.
    The time taken is as follows:

{
  "took": 7778,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "failed": 0
  },
  "hits": {
    "total": 92,
    "max_score": 15.208916,

  2. The match query is a full-text search over very large documents; the largest document is about 2.5 MB (and it is returned in the results).

  3. Heap usage across the three nodes: 10% (master node), 4% (routing node), 58% (data node).

I would like to understand what causes this and how to optimize the query.
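For reference, here is roughly how the query is issued, as a minimal sketch. It assumes the 5.x transport client; the index and field names are taken from the profile output below, and client setup is omitted.

import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.index.query.QueryBuilders;

public class SlowMatchQuery {
    // Run the match query with profiling enabled; setProfile(true) adds the
    // per-shard timing breakdown shown after the dividing line below.
    public static SearchResponse search(Client client) {
        return client.prepareSearch("baike_index")
                .setQuery(QueryBuilders.matchQuery("content", "ladder"))
                .setProfile(true)
                .get();
    }
}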

---------------- Dividing line ----------------
The following are the profile records for the query.

"profile": {
"shards": [
{
"id": "[BosI3A-bSJaVM0LHULDC9A] [baike_index] [0]",
"searches": [
{
"query": [
{
"type": "TermQuery",
"description": "content: ladder"
"time": "0.5679210000ms",
"time_in_nanos": 567921,
"breakdown": {
"score": 4818,
"build_scorer_count": 23,
"match_count": 0,
"create_weight": 490622,
"next_doc": 5347,
"match": 0,
"create_weight_count": 1,
"next_doc_count": 21,
"score_count": 18,
"build_scorer": 67071,
"advance": 0,
"advance_count": 0
}
}
],
"rewrite_time": 379627,
"collector": [
{
"name": "CancellableCollector",
"reason": "search_cancelled",
"time": "0.03252600000ms",
"time_in_nanos": 32526,
"children": [
{
"name": "SimpleTopScoreDocCollector",
"reason": "search_top_hits",
"time": "0.01902000000ms",
"time_in_nanos": 19020
}
]
}
]
}
],
"aggregations":
},
{
"id": "[BosI3A-bSJaVM0LHULDC9A] [baike_index] [1]",
"searches": [
{
"query": [
{
"type": "TermQuery",
"description": "content: ladder"
"time": "0.8072510000ms",
"time_in_nanos": 807251,
"breakdown": {
"score": 5896,
"build_scorer_count": 29,
"match_count": 0,
"create_weight": 745418,
"next_doc": 7775,
"match": 0,
"create_weight_count": 1,
"next_doc_count": 16,
"score_count": 13,
"build_scorer": 48103,
"advance": 0,
"advance_count": 0
}
}
],
"rewrite_time": 508630,
"collector": [
{
"name": "CancellableCollector",
"reason": "search_cancelled",
"time": "0.04867000000ms",
"time_in_nanos": 48670,
"children": [
{
"name": "SimpleTopScoreDocCollector",
"reason": "search_top_hits",
"time": "0.02597900000ms",
"time_in_nanos": 25979
}
]
}
]
}
],
"aggregations":
},
{
"id": "[BosI3A-bSJaVM0LHULDC9A] [baike_index] [2]",
"searches": [
{
"query": [
{
"type": "TermQuery",
"description": "content: ladder"
"time": "0.8369420000ms",
"time_in_nanos": 836942,
"breakdown": {
"score": 8774,
"build_scorer_count": 31,
"match_count": 0,
"create_weight": 744262,
"next_doc": 11751,
"match": 0,
"create_weight_count": 1,
"next_doc_count": 29,
"score_count": 25,
"build_scorer": 72069,
"advance": 0,
"advance_count": 0
}
}
],
"rewrite_time": 458680,
"collector": [
{
"name": "CancellableCollector",
"reason": "search_cancelled",
"time": "0.06351200000ms",
"time_in_nanos": 63512,
"children": [
{
"name": "SimpleTopScoreDocCollector",
"reason": "search_top_hits",
"time": "0.03681100000ms",
"time_in_nanos": 36811
}
]
}
]
}
],
"aggregations":
},
{
"id": "[BosI3A-bSJaVM0LHULDC9A] [baike_index] [4]",
"searches": [
{
"query": [
{
"type": "TermQuery",
"description": "content: ladder"
"time": "0.9213890000ms",
"time_in_nanos": 921389,
"breakdown": {
"score": 8899,
"build_scorer_count": 28,
"match_count": 0,
"create_weight": 745401,
"next_doc": 11755,
"match": 0,
"create_weight_count": 1,
"next_doc_count": 29,
"score_count": 22,
"build_scorer": 155254,
"advance": 0,
"advance_count": 0
}
}
],
"rewrite_time": 508615,
"collector": [
{
"name": "CancellableCollector",
"reason": "search_cancelled",
"time": "0.07178900000ms",
"time_in_nanos": 71789,
"children": [
{
"name": "SimpleTopScoreDocCollector",
"reason": "search_top_hits",
"time": "0.04696100000ms",
"time_in_nanos": 46961
}
]
}
]
}
],
"aggregations":
},
{
"id": "[DbUASJSzTDWaLQocTYuKpA] [baike_index] [3]",
"searches": [
{
"query": [
{
"type": "TermQuery",
"description": "content: ladder"
"time": "0.3358920000ms",
"time_in_nanos": 335892,
"breakdown": {
"score": 6083,
"build_scorer_count": 26,
"match_count": 0,
"create_weight": 273080,
"next_doc": 4956,
"match": 0,
"create_weight_count": 1,
"next_doc_count": 19,
"score_count": 14,
"build_scorer": 51713,
"advance": 0,
"advance_count": 0
}
}
],
"rewrite_time": 265629,
"collector": [
{
"name": "CancellableCollector",
"reason": "search_cancelled",
"time": "0.02894000000ms",
"time_in_nanos": 28940,
"children": [
{
"name": "SimpleTopScoreDocCollector",
"reason": "search_top_hits",
"time": "0.01789500000ms",
"time_in_nanos": 17895
}
]
}
]
}
],
"aggregations":
}
]
}
}

This is surprising indeed. The query is simple (TermQuery) and matches few documents. It should be very fast. Does it reproduce all the time?

I have already resolved it: the documents I search are very large, and I was using the plain highlighter. The plain highlighter re-analyzes the entire field text at fetch time, which is expensive for documents of this size. When I switched to the fvh (fast vector) highlighter, the problem went away.
Thanks!
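For anyone hitting the same issue, here is a minimal sketch of the change (again assuming the 5.x transport client, with the index and field names from above). Note that fvh only works on fields indexed with term vectors, so the mapping for "content" must include "term_vector": "with_positions_offsets".

import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.fetch.subphase.highlight.HighlightBuilder;

public class FvhHighlight {
    // Highlight with the fast vector highlighter instead of the default
    // "plain" highlighter, which re-analyzes the whole field at fetch time.
    public static SearchResponse search(Client client) {
        HighlightBuilder highlight = new HighlightBuilder()
                .field(new HighlightBuilder.Field("content").highlighterType("fvh"));
        return client.prepareSearch("baike_index")
                .setQuery(QueryBuilders.matchQuery("content", "ladder"))
                .highlighter(highlight)
                .get();
    }
}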
