Kibana not aggregating on large values of a field

I am using Kibana 5.5.2 and Elasticsearch 5.5 (on AWS).
I am trying to generate a pie chart with a sub-bucket based on a Terms aggregation of a field. I find that if the value of the field is longer than some limit, the aggregation returns no results,
and my pie chart looks like this as a result:

As you can see, the missing sector is there because the aggregation didn't return anything for those buckets. The value of the field for which the aggregation returns nothing is something like: org.dummy.common.DummyException: Failed to deserialize object of type org.dummy.datamodel.DummyModel from file /root/workdir/directory/Name/subdirectory/objectname/package/abc.def -> at org.dummy.main.deserializeObject.deserializeObjectClass(

On the other hand, values like
org.dummy.common.DummyException: Object with name: 'dummyexample' already exists -> at org.dummy.processwork.Manager.operation(
are being aggregated and shown in the pie chart.
Any help, guys?

Hey @Harsh_Verma can you check your index mappings for that field, specifically the ignore_above setting? Are the strings that aren't showing up above that character limit?
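
For reference, you can inspect the mapping with a request like the one below (run in Kibana Dev Tools; the index name `my-index` and field `message` are placeholders for your own):

```
GET my-index/_mapping

# In the response, look for the keyword sub-field and its ignore_above
# setting, e.g.:
#
#   "message": {
#     "type": "text",
#     "fields": {
#       "keyword": {
#         "type": "keyword",
#         "ignore_above": 256
#       }
#     }
#   }
#
# Strings longer than ignore_above are stored in _source but are NOT
# indexed for that keyword field, so they never show up in a Terms
# aggregation.
```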

@Brandon_Kobel, yes, it is set to 256, and the string I am trying to aggregate on is 264 characters. How can I change it? Oddly, I am able to see fields of any length in the Discover tab; only the Visualize tab ignores them.

The Discover tab displays the entire source document (`_source`), which is why you're seeing the full string there even though it isn't indexed for aggregations.


The ignore_above value can be updated on existing fields using the PUT mapping API.
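
A sketch of that request (in Elasticsearch 5.x the mapping API is per type; `my-index`, the type `log`, the field `message`, and the new limit `1024` are all placeholders, so substitute your own):

```
PUT my-index/_mapping/log
{
  "properties": {
    "message": {
      "type": "text",
      "fields": {
        "keyword": {
          "type": "keyword",
          "ignore_above": 1024
        }
      }
    }
  }
}
```

Note that changing ignore_above only affects documents indexed after the change.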

@Brandon_Kobel, so if I update the ignore_above value for a field, will Elasticsearch retroactively re-index the documents that are already stored, so that my pie chart starts showing the missing values?

No, you'll have to reindex your existing documents for the new setting to take effect on them.
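
One way to do that is the Reindex API: create a new index with the updated mapping first, then copy the documents across. A sketch, assuming hypothetical index names `my-index` and `my-index-v2`:

```
POST _reindex
{
  "source": {
    "index": "my-index"
  },
  "dest": {
    "index": "my-index-v2"
  }
}
```

After reindexing, point your Kibana index pattern (or an alias) at the new index so the visualizations pick up the re-indexed field values.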
