Hi,
I have a standalone setup of the ELK stack, version 7.12.1, on a Windows machine. I have ingested examination data from a CSV file into Elasticsearch using Logstash. From this data I need to show the total student count, which I did using a metric visualization as shown below.
This gives me the correct count of students. Below is the request for this visualization.
{
  "aggs": {
    "1": {
      "cardinality": {
        "field": "EEID.keyword"
      }
    }
  },
  "size": 0,
  "fields": [
    {
      "field": "@timestamp",
      "format": "date_time"
    },
    {
      "field": "exam_date",
      "format": "date_time"
    }
  ],
  "script_fields": {
    "Negative Mark": {
      "script": {
        "source": "0.25",
        "lang": "painless"
      }
    }
  },
  "stored_fields": [
    "*"
  ],
  "runtime_mappings": {},
  "_source": {
    "excludes": []
  },
  "query": {
    "bool": {
      "must": [],
      "filter": [
        {
          "match_all": {}
        },
        {
          "range": {
            "@timestamp": {
              "gte": "2021-03-11T07:13:49.217Z",
              "lte": "2021-06-09T07:13:49.217Z",
              "format": "strict_date_optional_time"
            }
          }
        }
      ],
      "should": [],
      "must_not": []
    }
  }
}
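To rule out the source data itself, I also verified the exact number of distinct students directly from the CSV. A minimal sketch of that check (the file layout and the sample rows below are assumptions; EEID is the student identifier):

```python
import csv
import io


def distinct_count(csv_text: str, column: str) -> int:
    """Count distinct values of `column` in CSV text (an exact count,
    unlike the approximate cardinality aggregation)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return len({row[column] for row in reader})


# Hypothetical sample mirroring the ingested exam data.
sample = """EEID,exam_date,score
E001,2021-04-01,78
E002,2021-04-01,65
E001,2021-04-02,81
E003,2021-04-02,90
"""

print(distinct_count(sample, "EEID"))  # 3 distinct students in this sample
```

In practice the same function can be pointed at the real export (e.g. `open("exam_data.csv").read()`) and compared against what the metric visualization reports.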
I also have a similar standalone production environment running ELK stack version 7.9.0 on a CentOS machine. I ingested the data into production using Logstash 7.12.1. I want to build the same use case as in my local setup (total student count), but I am getting a count mismatch in the metric visualization.
The request for the above visualization is:
{
  "aggs": {
    "1": {
      "cardinality": {
        "field": "EEID.keyword"
      }
    }
  },
  "size": 0,
  "stored_fields": [
    "*"
  ],
  "script_fields": {
    "value_field": {
      "script": {
        "source": "1",
        "lang": "painless"
      }
    }
  },
  "docvalue_fields": [
    {
      "field": "@timestamp",
      "format": "date_time"
    },
    {
      "field": "exam_date",
      "format": "date_time"
    }
  ],
  "_source": {
    "excludes": []
  },
  "query": {
    "bool": {
      "must": [],
      "filter": [
        {
          "match_all": {}
        },
        {
          "range": {
            "@timestamp": {
              "gte": "2021-03-11T07:30:34.150Z",
              "lte": "2021-06-09T07:30:34.150Z",
              "format": "strict_date_optional_time"
            }
          }
        }
      ],
      "should": [],
      "must_not": []
    }
  }
}
When I check the count using a data table visualization, it gives the correct count.
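One thing I am unsure about: the cardinality aggregation is approximate, and above the default precision_threshold of 3000 it can undercount. Could that explain the mismatch? For reference, a version of the aggregation with the threshold raised to its documented maximum of 40000 would look like this:

```json
{
  "aggs": {
    "1": {
      "cardinality": {
        "field": "EEID.keyword",
        "precision_threshold": 40000
      }
    }
  },
  "size": 0
}
```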
Kindly help.
Thank you,
Regards,
Abhishek