Kibana Term Aggregation

Hello!! How are you?

My Kibana is having issues with aggregation visualizations.
Here is what is happening: I have a SQL database as the source system. Logstash extracts its data and feeds the Elasticsearch index via a denormalized query joining 3 target tables.
Everything goes OK and no data gets lost along the way (I've extracted the data from Elasticsearch and compared it against the SQL data by counting rows and summing amount fields).
However, when I do a metric count in Kibana (the same thing I do in the Python script and with SQL SUM or COUNT), the values don't match.
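For context, a minimal sketch of the kind of cross-check described above; the connection and extraction details are omitted, and the numbers are placeholders mirroring the symptom:

```python
# Sketch: compare SQL totals against Elasticsearch totals field by field.
# The field names and numbers below are placeholders, not the real schema.

def compare_totals(sql_totals: dict, es_totals: dict) -> dict:
    """Return {field: (sql_value, es_value)} for every field that disagrees."""
    return {
        field: (sql_totals[field], es_totals.get(field))
        for field in sql_totals
        if sql_totals[field] != es_totals.get(field)
    }

# Illustrative values: 63290 rows in SQL vs. 61,420 counted in Kibana.
sql_totals = {"doc_count": 63290, "amount_sum": 1234567.89}
es_totals = {"doc_count": 61420, "amount_sum": 1234567.89}

print(compare_totals(sql_totals, es_totals))
# {'doc_count': (63290, 61420)}
```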

Can anyone help me, please? This has been bugging me for a long time now.

Best wishes,
Fernando Durier.

My setup now:
Elasticsearch running on IBM Openshift Cluster on IBM Cloud
Logstash running on a pod inside IBM Openshift Cluster on IBM Cloud
Kibana running on a pod inside IBM Openshift Cluster on IBM Cloud
Source Database -> DB2forZ/OS
Index -> replicas:1; number_of_rows:63290; shard:1;

P.S.: Even the counts in Kibana don't match the number of docs in Elasticsearch, e.g. 63290 is the correct number of documents across my pipeline, but Kibana counts 61,420.

I've rerun the count now, just for experimental purposes, and it changed again...

Hi @Fernando1

You probably need to adjust the time picker to include all the ingested data.

You can cross-check the document count on the Discover page as well.

Just to confirm: to get the number of documents, you are using GET _cat/indices/<your_index> on the Dev Tools page.
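For reference, the same check works as a plain HTTP GET outside Dev Tools; a sketch, assuming the default local endpoint (substitute your cluster's host and index name):

```python
# Build the _cat/indices URL that reports an index's document count.
# ES_HOST is an assumption (default local endpoint); in Dev Tools the
# same call is simply: GET _cat/indices/<your_index>?v
ES_HOST = "http://localhost:9200"

def cat_indices_url(index: str) -> str:
    # h= restricts the output columns to the index name and its doc count
    return f"{ES_HOST}/_cat/indices/{index}?v&h=index,docs.count"

print(cat_indices_url("my_index"))
# http://localhost:9200/_cat/indices/my_index?v&h=index,docs.count
```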

Regards, Dzmitry


Hi, Dzmitry!!
Thanks for answering.
My time picker is set from 1980 until today (and my oldest record is from 2017).
Still, the counts don't match.
But the command you asked me to try, GET _cat/indices/, worked and returned the expected value.

It seems the Kibana count aggregation is not working properly, or it has a default config that I need to override somehow.

Hey Dzmitry, in the end you were right!!
In fact my oldest record is from 2017, but my data also has future-dated entries, and those were what was missing!! hahaha
Thanks again!!!
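For anyone hitting the same symptom, the future-dated documents that an end-bounded time picker silently excludes can be counted directly. A sketch of the query body; the field name `@timestamp` is an assumption, so substitute your index's date field:

```python
import json

# Build a query body counting documents stamped later than "now" --
# exactly the ones a time picker that ends "today" would drop.
# "@timestamp" is a placeholder field name, not the thread's real schema.
def future_docs_query(field: str = "@timestamp") -> dict:
    return {"query": {"range": {field: {"gt": "now"}}}}

# POST this body to <your_index>/_count (or paste it into Dev Tools).
print(json.dumps(future_docs_query()))
```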
