Hi everyone, I was able to successfully retrieve the data I need (my top 10 applications by ERROR log volume) using this query:
GET _search
{
  "size": 0,
  "query": {
    "bool": {
      "must": [
        {
          "bool": {
            "minimum_should_match": 1,
            "should": [
              { "match_phrase": { "level": "error" } },
              { "match_phrase": { "level": "ERROR" } },
              { "match_phrase": { "level": "ERR" } }
            ]
          }
        },
        {
          "range": {
            "@timestamp": {
              "gte": 1644875811155,
              "lte": 1647464211155,
              "format": "epoch_millis"
            }
          }
        }
      ]
    }
  },
  "aggs": {
    "group_by_app": {
      "terms": {
        "field": "app"
      }
    }
  }
}
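For a scheduled report I'd have to replace the hard-coded epoch-millis bounds (they cover roughly the last 30 days) with a rolling window. Here is a rough sketch of the same query run from Python, assuming the official elasticsearch client with 8.x-style keyword arguments; the host and index pattern are placeholders:

# Sketch: the same aggregation with a rolling 30-day window instead of
# the hard-coded epoch-millis bounds. Host and index are placeholders.
from datetime import datetime, timedelta, timezone

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder host

now = datetime.now(timezone.utc)
window = {
    "gte": int((now - timedelta(days=30)).timestamp() * 1000),
    "lte": int(now.timestamp() * 1000),
    "format": "epoch_millis",
}

query = {
    "bool": {
        "must": [
            {
                "bool": {
                    "minimum_should_match": 1,
                    "should": [
                        {"match_phrase": {"level": v}}
                        for v in ("error", "ERROR", "ERR")
                    ],
                }
            },
            {"range": {"@timestamp": window}},
        ]
    }
}

resp = es.search(
    index="logs-*",  # placeholder; GET _search above hits all indices
    size=0,
    query=query,
    # "size": 10 just makes the terms agg's default top 10 explicit
    aggs={"group_by_app": {"terms": {"field": "app", "size": 10}}},
)
for b in resp["aggregations"]["group_by_app"]["buckets"]:
    print(b["key"], b["doc_count"])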
From that query I get back a JSON response with the apps and their counts:
{
  "took" : 28680,
  "timed_out" : false,
  "num_reduce_phases" : 2,
  "_shards" : {
    "total" : 4961,
    "successful" : 4831,
    "skipped" : 4234,
    "failed" : 130,
    "failures" : [
      ...
    ]
  },
  "hits" : {
    "total" : 8417850,
    "max_score" : 0.0,
    "hits" : [ ]
  },
  "aggregations" : {
    "group_by_app" : {
      "doc_count_error_upper_bound" : 24619,
      "sum_other_doc_count" : 2032266,
      "buckets" : [
        {
          "key" : "runaway-train",
          "doc_count" : 2014561
        },
        {
          "key" : "minor-contender",
          "doc_count" : 623726
        },
        {
          "key" : "not-a-concern",
          "doc_count" : 594091
        },
        {
          "key" : "work-as-usual",
          "doc_count" : 300578
        },
        {
          "key" : "almost-invisible",
          "doc_count" : 268338
        }
      ]
    }
  }
}
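Pulling the buckets out of that response and into something mail-friendly seems simple enough; a sketch (the stand-in resp below just mirrors the aggregations section above):

# Sketch: render the aggregation buckets as a small HTML table for the
# report body. The stand-in resp mirrors the real response structure.
resp = {
    "aggregations": {
        "group_by_app": {
            "buckets": [
                {"key": "runaway-train", "doc_count": 2014561},
                {"key": "minor-contender", "doc_count": 623726},
            ]
        }
    }
}

buckets = resp["aggregations"]["group_by_app"]["buckets"]
rows = "".join(
    f"<tr><td>{b['key']}</td><td>{b['doc_count']:,}</td></tr>"
    for b in buckets
)
html_table = (
    "<table><tr><th>Application</th><th>ERROR count</th></tr>"
    + rows
    + "</table>"
)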
I now need to turn this into a bi-weekly email report that I'd like to look a lot like Kibana's own page.
Can this be achieved inside the ELK stack, or should I take the data out, run it through some Python, and email it?
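In case the Python route is the answer, this is roughly what I have in mind for the emailing part (smtplib from the standard library; host, addresses, and credentials are placeholders, and the bi-weekly schedule would live in cron or similar):

# Sketch of the "python and email it" option: send the HTML table via
# SMTP. All hosts, addresses, and credentials below are placeholders.
import smtplib
from email.mime.text import MIMEText

html_table = "<table>...</table>"  # built in the previous snippet

msg = MIMEText(
    "<h2>Top ERROR-producing applications, last 30 days</h2>" + html_table,
    "html",
)
msg["Subject"] = "Bi-weekly ERROR log report"
msg["From"] = "reports@example.com"  # placeholder sender
msg["To"] = "team@example.com"  # placeholder recipient

with smtplib.SMTP("smtp.example.com", 587) as smtp:  # placeholder host
    smtp.starttls()
    smtp.login("reports@example.com", "app-password")  # placeholder creds
    smtp.send_message(msg)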
Many thanks,
Dan