Index has exceeded [1000000] - maximum allowed to be analyzed for highlighting

Hello,
I've come across a strange problem when trying to ship PostgreSQL logs to Elasticsearch using Filebeat and display them in Kibana.

I have configured Elasticsearch with Kibana and Logstash on one of my servers, and now I'd like to send PostgreSQL logs from a different server to my Elasticsearch instance.

Everything looks OK: when I run Filebeat it does in fact pick up data from the PostgreSQL logs, and when I query ".../_cat/indices?v" I can see that docs.count is increasing. But when I go into Discover to view my data, I get the following error:
Request to Elasticsearch failed:

{
  "error" : {
    "root_cause" : [
      {
        "type" : "illegal_argument_exception",
        "reason" : "The length of [error.message] field of [kSXSJW0BtQYwaLb_TUKv] doc of [filebeat-7.3.1-2019.09.12-000001] index has exceeded [1000000] - maximum allowed to be analyzed for highlighting. This maximum can be set by changing the [index.highlight.max_analyzed_offset] index level setting. For large texts, indexing with offsets or term vectors is recommended!"
      }
    ],
    "type" : "search_phase_execution_exception",
    "reason" : "all shards failed",
    "phase" : "query",
    "grouped" : true,
    "failed_shards" : [
      {
        "shard" : 0,
        "index" : "filebeat-7.3.1-2019.09.12-000001",
        "node" : "24QHEmnTSYy_R8XnLUSLqw",
        "reason" : {
          "type" : "illegal_argument_exception",
          "reason" : "The length of [error.message] field of [kSXSJW0BtQYwaLb_TUKv] doc of [filebeat-7.3.1-2019.09.12-000001] index has exceeded [1000000] - maximum allowed to be analyzed for highlighting. This maximum can be set by changing the [index.highlight.max_analyzed_offset] index level setting. For large texts, indexing with offsets or term vectors is recommended!"
        }
      }
    ],
    "caused_by" : {
      "type" : "illegal_argument_exception",
      "reason" : "The length of [error.message] field of [kSXSJW0BtQYwaLb_TUKv] doc of [filebeat-7.3.1-2019.09.12-000001] index has exceeded [1000000] - maximum allowed to be analyzed for highlighting. This maximum can be set by changing the [index.highlight.max_analyzed_offset] index level setting. For large texts, indexing with offsets or term vectors is recommended!",
      "caused_by" : {
        "type" : "illegal_argument_exception",
        "reason" : "The length of [error.message] field of [kSXSJW0BtQYwaLb_TUKv] doc of [filebeat-7.3.1-2019.09.12-000001] index has exceeded [1000000] - maximum allowed to be analyzed for highlighting. This maximum can be set by changing the [index.highlight.max_analyzed_offset] index level setting. For large texts, indexing with offsets or term vectors is recommended!"
      }
    }
  },
  "status" : 400
}

What exactly is the index.highlight.max_analyzed_offset parameter, and what is the maximum message size?
I've managed to increase its value by running the request below:

PUT ".../filebeat-7.3.1-2019.09.12-000001/_settings"
{
  "index" : {
    "highlight.max_analyzed_offset" : 60000000
  }
}
After doing so the error is gone in Discover, but after playing with it for a minute or two, Kibana freezes every time.
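For completeness: the docs suggest the same request accepts a wildcard target, so the change could presumably be applied to all Filebeat indices in one go (an untested guess on my part):

PUT ".../filebeat-*/_settings"
{
  "index" : {
    "highlight.max_analyzed_offset" : 60000000
  }
}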

So my question is: what is the correct way of dealing with this issue?
Can I add something to my Elasticsearch/Kibana/Filebeat configs so that it works every time, without me having to touch individual indices?
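From what I've read, an index template might bake the setting into every new index automatically. Something like the sketch below is what I have in mind (untested on my side; it assumes the legacy _template API available in 7.3 and the default filebeat-* index pattern, the template name is arbitrary, and I'm not sure how it interacts with the template Filebeat loads itself):

PUT ".../_template/filebeat-highlight"
{
  "index_patterns" : ["filebeat-*"],
  "settings" : {
    "index.highlight.max_analyzed_offset" : 60000000
  }
}

Is that the right approach, or does it just hide the problem?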

The error message above also says "For large texts, indexing with offsets or term vectors is recommended". What does that mean exactly, and how can I do it, if that is in fact the correct way forward? I wasn't able to get it working either way.
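If I understand the docs correctly, it would mean a mapping change along these lines. This is purely a sketch of what I think is meant: error.message with term_vector is my guess from the mapping docs, the index name is just an example, and since term_vector apparently can't be changed on an existing field, it would need a fresh index:

PUT ".../filebeat-termvector-test"
{
  "mappings" : {
    "properties" : {
      "error" : {
        "properties" : {
          "message" : {
            "type" : "text",
            "term_vector" : "with_positions_offsets"
          }
        }
      }
    }
  }
}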

From Kibana's perspective, you can disable highlighting completely (if that's what you prefer) to get rid of this error message by disabling the advanced setting, doc_table:highlight.
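You'll find that under Management > Advanced Settings in Kibana. If you'd rather script the change than click through the UI, something like the request below should also work. Treat it as a sketch only: it uses Kibana's internal settings endpoint, which is unsupported and may change between versions, and it needs the kbn-xsrf header set (host and credentials are yours to fill in):

POST ".../api/kibana/settings"   (with header kbn-xsrf: true)
{
  "changes" : {
    "doc_table:highlight" : false
  }
}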

As for your questions about the correct way to deal with this on the Elasticsearch side, you may want to move this topic to the Elasticsearch forum for a proper answer.


OK, I'll try this and ask about it on the Elasticsearch forum.
Thank you for your answer.
