How to increase the message size limit of the Elasticsearch API (max_clause_count)

Hello elastic world,

I'm currently trying to solve a problem regarding log display in Kibana.
I'm using Elasticsearch 6.6.1 and Kibana 6.6.1.
When creating visualizations for my logs, the data set is missing every long log message (longer than roughly 300 characters), even though the messages are present in Elasticsearch.
I've tried the following configuration change in elasticsearch.yml:

indices.query.bool.max_clause_count: 4096

But this doesn't solve my problem. Is there an issue with long messages in 6.6.1?
Or am I missing a configuration setting?
Thanks!

Hi Guillaumee,

That setting controls the maximum number of terms that can be used in a query, so it is not the right one here.
I think the issue is that the long messages are actually missing from the index.
Lucene has a hard limit on the length of indexed values, so index mappings tend to set limits for
keyword fields (see the ignore_above setting).
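By default, dynamically mapped string fields get a keyword sub-field with ignore_above: 256, which would match the ~300 character cut-off you are seeing: values longer than that are simply not indexed into the keyword field, so they never show up in terms aggregations. As a rough sketch (the index, type and field names below are only examples; check your own with GET <your-index>/_mapping), the dynamic default is equivalent to:

PUT my-logs
{
  "mappings": {
    "_doc": {
      "properties": {
        "message": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        }
      }
    }
  }
}

Raising ignore_above lifts that cap, at the cost of more heap and disk for the long keyword terms.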

You end up with too many large unique values in these situations, so it would make more sense to do one of the following:

  1. Truncate the values (this can be done in the mapping)
  2. Pre-process them to recognise certain long messages and swap them for shorter reason codes
  3. Hash the values of the text messages (this can be done in the mapping or in an ingest pipeline; see the sketch after this list)
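For option 3, a rough sketch of the ingest pipeline route (the pipeline name, field name and 256-character threshold are all made up for the example; adjust them to your data):

PUT _ingest/pipeline/hash-long-messages
{
  "description": "Add a short hash for very long messages so they can be aggregated",
  "processors": [
    {
      "script": {
        "if": "ctx.message != null && ctx.message.length() > 256",
        "source": "ctx.message_hash = ctx.message.hashCode()"
      }
    }
  ]
}

Documents indexed through that pipeline (for example PUT my-logs/_doc/1?pipeline=hash-long-messages) get a short message_hash value you can aggregate on instead of the full message. For the "in the mapping" route, the mapper-murmur3 plugin provides a murmur3 field type, which is mainly useful for cardinality aggregations.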

Thanks Mark, I solved the problem by updating the mapping of the index 🙂
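In case it helps anyone else, raising the limit looks roughly like this (the index, type and field names are placeholders); ignore_above can normally be updated on an existing field with the put mapping API, and otherwise the same change can go into the index template so the next index picks it up:

PUT my-logs/_mapping/_doc
{
  "properties": {
    "message": {
      "type": "text",
      "fields": {
        "keyword": {
          "type": "keyword",
          "ignore_above": 2048
        }
      }
    }
  }
}

Documents indexed before the change keep their missing keyword values, so a reindex (or waiting for the next time-based index) is needed before the old long messages appear in visualizations.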
