Kibana working very slowly and showing errors randomly

My Kibana is working very slowly. I only have 4 indices in it, with 300, 45,000, 66,000, and 12,000 documents respectively.
Also, once I start the server, after some time it gives a circuit break exception.
This did not happen in the beginning, when I started using Elasticsearch and Kibana in January 2020.
Why is this happening?
What should I do to rectify this situation?

Could you share the Elasticsearch logs?

How do I share the file?

You can copy its content here and format it with markdown or the </> icon.
If it is too big, share it on gist.github.com and put the link here.

I'm surprised that I can't see any "circuit break exception" in the logs.
Could you share the Kibana logs or a screenshot when this is happening?

You can see here that when I search for the word deadlines in the field body, I get all 300 documents as a result (even the ones that don't contain that word or its synonyms). Also, with this operation, I get an internal server error, as you can see at the top.

This is what then shows up in the cmd window where I started my Kibana server.

Also, this is my mapping for the body field:

    "body" : {
              "type" : "text",
              "analyzer" : "synonym_analyzer"
            }

And this is my synonym_analyzer:

    "filter" : {
            "synonym_filter" : {
              "type" : "synonym",
              "synonyms_path" : "wn_s.txt",
              "updateable" : "true"
            }
          },
          "analyzer" : {
            "synonym_analyzer" : {
              "filter" : [
                "lowercase",
                "synonym_filter"
              ],
              "tokenizer" : "standard"
            }
          }

Here, wn_s.txt is the WordNet sense dictionary.
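
In case it helps, one way to check what this analyzer actually produces for the word deadlines is the _analyze API (just a sketch; idxquesanswer is the index name that appears in the logs below):

    GET idxquesanswer/_analyze
    {
      "analyzer": "synonym_analyzer",
      "text": "deadlines"
    }

If the synonym expansion turns out to be very large, a simple match query on body can end up matching almost every document, which would fit the 300-result behaviour described above.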

log   [09:38:18.661] [info][listening] Server running at http://localhost:5601
 error  [11:08:06.495]  [circuit_breaking_exception] [parent] Data too large, data for [<http_request>] would be [994882004/948.7mb], which is larger than the limit of [986061209/940.3mb], real usage: [994880832/948.7mb], new bytes reserved: [1172/1.1kb], with { bytes_wanted=994882004 & bytes_limit=986061209 & durability="PERMANENT" } :: {"path":"/_msearch","query":{"rest_total_hits_as_int":"true","ignore_throttled":"true"},"body":"{\"index\":\"idxquesanswer\",\"ignore_unavailable\":true,\"preference\":1583406382581}\n{\"version\":true,\"size\":500,\"sort\":[{\"_score\":{\"order\":\"desc\"}}],\"_source\":{\"excludes\":[]},\"stored_fields\":[\"*\"],\"script_fields\":{},\"docvalue_fields\":[{\"field\":\"creationDate\",\"format\":\"date_time\"}],\"query\":{\"bool\":{\"must\":[],\"filter\":[{\"bool\":{\"should\":[{\"match\":{\"body\":\"deadlines\"}}],\"minimum_should_match\":1}}],\"should\":[],\"must_not\":[]}},\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"fragment_size\":2147483647},\"timeout\":\"30000ms\"}\n","statusCode":429,"response":"{\"error\":{\"root_cause\":[{\"type\":\"circuit_breaking_exception\",\"reason\":\"[parent] Data too large, data for [<http_request>] would be [994882004/948.7mb], which is larger than the limit of [986061209/940.3mb], real usage: [994880832/948.7mb], new bytes reserved: [1172/1.1kb]\",\"bytes_wanted\":994882004,\"bytes_limit\":986061209,\"durability\":\"PERMANENT\"}],\"type\":\"circuit_breaking_exception\",\"reason\":\"[parent] Data too large, data for [<http_request>] would be [994882004/948.7mb], which is larger than the limit of [986061209/940.3mb], real usage: [994880832/948.7mb], new bytes reserved: [1172/1.1kb]\",\"bytes_wanted\":994882004,\"bytes_limit\":986061209,\"durability\":\"PERMANENT\"},\"status\":429}"}
at respond (C:\Users\Ravinder\Desktop\chatbot\kibana-7.1.1-windows-x86_64\node_modules\elasticsearch\src\lib\transport.js:308:15)
at checkRespForFailure (C:\Users\Ravinder\Desktop\chatbot\kibana-7.1.1-windows-x86_64\node_modules\elasticsearch\src\lib\transport.js:267:7)
at HttpConnector.<anonymous> (C:\Users\Ravinder\Desktop\chatbot\kibana-7.1.1-windows-x86_64\node_modules\elasticsearch\src\lib\connectors\http.js:166:7)
at IncomingMessage.wrapper (C:\Users\Ravinder\Desktop\chatbot\kibana-7.1.1-windows-x86_64\node_modules\elasticsearch\node_modules\lodash\lodash.js:4935:19)
at IncomingMessage.emit (events.js:194:15)
at endReadableNT (_stream_readable.js:1103:12)
at process._tickCallback (internal/process/next_tick.js:63:19)

If the second screenshot is unclear, you can see here what it shows.

Hard to know. Maybe increase the heap allocated to Elasticsearch? You are using only 1 GB. Maybe 2 GB would help.

I updated the heap size to 2 GB, but the problem still isn't solved.

What is the error message? What are the Elasticsearch logs?

Please don't post images of text as they are hard to read, may not display correctly for everyone, and are not searchable.

Instead, paste the text and format it with the </> icon or pairs of triple backticks (```), and check the preview window to make sure it's properly formatted before posting. This makes it more likely that your question will receive a useful answer.

How large are your documents? How much space do they take up on disk? Are you using parent-child or nested documents? How large is the synonyms file?
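
If it helps, you can get a rough idea of document counts and on-disk size per index with the _cat/indices API, for example (the column list here is only an illustration):

    GET _cat/indices?v&h=index,docs.count,store.size,pri.store.size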

I don't know the exact size of my documents, but they contain 8 to 24 fields (different for each of my 4 indices), and the data stored in those fields is of type string.

Also, yes, I'm using parent-child relationships in one of my indices. And it is only when I insert data into this index that it throws a circuit_breaking_exception saying "Data too large".

My synonym file is 7,527 KB in size.

Please reply: what should I do?

Read this and specifically the "Also be patient" part.

It's fine to answer on your own thread after 2 or 3 days (not including weekends) if you don't have an answer.

And please reply to the questions I asked as well. Thanks.

I deeply regret being impatient. I apologize for that, David.

Here are my Elasticsearch logs:

And this is what I get in the cmd window where I started the Kibana server:

log   [09:38:18.661] [info][listening] Server running at http://localhost:5601
error  [11:08:06.495]  [circuit_breaking_exception] [parent] Data too large, data for [<http_request>] would be [994882004/948.7mb], which is larger than the limit of [986061209/940.3mb], real usage: [994880832/948.7mb], new bytes reserved: [1172/1.1kb], with { bytes_wanted=994882004 & bytes_limit=986061209 & durability="PERMANENT" } :: {"path":"/_msearch","query":{"rest_total_hits_as_int":"true","ignore_throttled":"true"},"body":"{\"index\":\"idxquesanswer\",\"ignore_unavailable\":true,\"preference\":1583406382581}\n{\"version\":true,\"size\":500,\"sort\":[{\"_score\":{\"order\":\"desc\"}}],\"_source\":{\"excludes\":[]},\"stored_fields\":[\"*\"],\"script_fields\":{},\"docvalue_fields\":[{\"field\":\"creationDate\",\"format\":\"date_time\"}],\"query\":{\"bool\":{\"must\":[],\"filter\":[{\"bool\":{\"should\":[{\"match\":{\"body\":\"deadlines\"}}],\"minimum_should_match\":1}}],\"should\":[],\"must_not\":[]}},\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"fragment_size\":2147483647},\"timeout\":\"30000ms\"}\n","statusCode":429,"response":"{\"error\":{\"root_cause\":[{\"type\":\"circuit_breaking_exception\",\"reason\":\"[parent] Data too large, data for [<http_request>] would be [994882004/948.7mb], which is larger than the limit of [986061209/940.3mb], real usage: [994880832/948.7mb], new bytes reserved: [1172/1.1kb]\",\"bytes_wanted\":994882004,\"bytes_limit\":986061209,\"durability\":\"PERMANENT\"}],\"type\":\"circuit_breaking_exception\",\"reason\":\"[parent] Data too large, data for [<http_request>] would be [994882004/948.7mb], which is larger than the limit of [986061209/940.3mb], real usage: [994880832/948.7mb], new bytes reserved: [1172/1.1kb]\",\"bytes_wanted\":994882004,\"bytes_limit\":986061209,\"durability\":\"PERMANENT\"},\"status\":429}"}
at respond (C:\Users\Ravinder\Desktop\chatbot\kibana-7.1.1-windows-x86_64\node_modules\elasticsearch\src\lib\transport.js:308:15)
at checkRespForFailure (C:\Users\Ravinder\Desktop\chatbot\kibana-7.1.1-windows-x86_64\node_modules\elasticsearch\src\lib\transport.js:267:7)
at HttpConnector.<anonymous> (C:\Users\Ravinder\Desktop\chatbot\kibana-7.1.1-windows-x86_64\node_modules\elasticsearch\src\lib\connectors\http.js:166:7)
at IncomingMessage.wrapper (C:\Users\Ravinder\Desktop\chatbot\kibana-7.1.1-windows-x86_64\node_modules\elasticsearch\node_modules\lodash\lodash.js:4935:19)
at IncomingMessage.emit (events.js:194:15)
at endReadableNT (_stream_readable.js:1103:12)
at process._tickCallback (internal/process/next_tick.js:63:19)

Also, as you instructed earlier, I updated my heap size to 2 GB, but the problem I'm facing remains unsolved.

The logs report that you still have 1 GB of heap.
Check the installation guide for how to configure the heap; until the logs show you have 2 GB, it's useless to run the same tests.
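
As a rough sketch, the heap is normally set in the jvm.options file of the Elasticsearch config directory (not Kibana's), keeping both values equal; the values below are only an example:

    # config/jvm.options of the Elasticsearch installation
    -Xms2g
    -Xmx2g

After restarting Elasticsearch, something like GET _cat/nodes?v&h=name,heap.max should confirm the new heap size before re-running the same tests.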

Thanks, David.
Yes, the heap size was not updated earlier.
Now I have updated it and the problem I was facing has been solved.
