# Show in Kibana - DAU/MAU and WAU/MAU

Hello! I'm trying to get a line chart that shows the calculations of DAU/MAU and WAU/MAU (DAU = Daily Active Users, WAU = Weekly, MAU = Monthly). Thanks to another user's help, I was able to get a graph showing DAU and I could do the same thing to get a graph with WAU, but what I'm really interested in is the ability to graph that formula. To give a little context - every user activity is logged as a single record (with a userId value) in an index. The graphs group those records per day unique'd by userId and then plot them on a line chart. Is there a way to take that value and divide it by the MAU in a single graph?

Here's an example if my above explanation wasn't clear:
Imagine I have 1000 daily active users today, 2000 yesterday, and 1500 the day before. I can easily graph that, but what I really want to do is take each day and divide it by the total number of active users in the previous 30 days. So for today, I'd divide 1000 by the number of active users in the past 30 days, yesterday I would divide 2000 by the total number of users active in the past 30 days shifted over 1 day, and so on.
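The shifted-window arithmetic described above can be sketched in a few lines of plain Python (a standalone illustration of the math, not an Elasticsearch query; the event list is made up):

```python
from datetime import date, timedelta

# Hypothetical activity log: one (day, userId) row per user action.
events = [
    (date(2023, 5, 1), "u1"), (date(2023, 5, 1), "u2"),
    (date(2023, 5, 2), "u1"), (date(2023, 5, 3), "u3"),
]

def dau_mau(events, day, window=30):
    """DAU/MAU for `day`: unique users seen on `day`, divided by
    unique users seen over the `window` days ending on `day`."""
    dau = {u for d, u in events if d == day}
    start = day - timedelta(days=window - 1)
    mau = {u for d, u in events if start <= d <= day}
    return len(dau) / len(mau) if mau else 0.0
```

For the sample data, `dau_mau(events, date(2023, 5, 3))` is 1/3: one user active that day (`u3`) out of three unique users in the trailing window.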

Let me know if that makes sense or if I can clarify anything!

Host: Elastic Cloud
Elastic Version: 8.7

Anyone have any ideas here?

First, are we talking about unique users, correct?

a) Daily Active Users = daily unique users

b) Monthly Active Users = total unique users over 30 days, calculated each day

Here is why this is not simple to do in Elasticsearch:

The graphs are based on [Date Histograms](https://www.elastic.co/guide/en/elasticsearch/reference/current/search-aggregations-bucket-datehistogram-aggregation.html), and A and B have different bucket sizes:

A is a 1-day bucket, which is pretty easy and makes sense.

B is a 30-day bucket. You can't combine 2 different bucket sizes into a single aggregation (or at least I can't), AND what you really want is to calculate the moving 30-day total of unique users every day... which is not just a 30-day or monthly histogram, so that is more difficult.

So, if you really want DAU / MAU:

Probably the easiest way is to write a little script that runs 2 aggregations and posts the results into a single index. Run this script once a day.

Here are 2 simple aggs:
The first is distinct users over the last 24 hours.
The second is distinct users over the last 30 days.
Note these are not date histograms; they are range queries with aggregations.
Run the 2 queries once a day and put both values into a single document in an index, and then you can do what you please with the formula.

```
# Distinct users, last 24 hours
GET /kibana_sample_data_ecommerce/_search
{
  "size": 0,
  "query": {
    "bool": {
      "filter": [
        {
          "range": {
            "order_date": {
              "gte": "now-24h/h",
              "lte": "now"
            }
          }
        }
      ]
    }
  },
  "aggs": {
    "distinct_users": {
      "cardinality": {
        "field": "customer_id"
      }
    }
  }
}

# Distinct users, last 30 days
GET /kibana_sample_data_ecommerce/_search
{
  "size": 0,
  "query": {
    "bool": {
      "filter": [
        {
          "range": {
            "order_date": {
              "gte": "now-30d/d",
              "lte": "now"
            }
          }
        }
      ]
    }
  },
  "aggs": {
    "distinct_users": {
      "cardinality": {
        "field": "customer_id"
      }
    }
  }
}
```
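A minimal sketch of that once-a-day script in Python, using only the standard library against the same sample index (the endpoint URL and destination index name are assumptions; authentication and error handling are omitted):

```python
import json
import urllib.request
from datetime import datetime, timezone

ES_URL = "http://localhost:9200"            # assumption: your cluster endpoint
SOURCE_INDEX = "kibana_sample_data_ecommerce"
DEST_INDEX = "daily_active_ratios"          # assumption: destination index name

def cardinality_query(window):
    """Range-filtered cardinality agg, matching the console examples above."""
    return {
        "size": 0,
        "query": {"bool": {"filter": [
            {"range": {"order_date": {"gte": f"now-{window}", "lte": "now"}}}
        ]}},
        "aggs": {"distinct_users": {"cardinality": {"field": "customer_id"}}},
    }

def build_doc(dau, mau, now=None):
    """Combine both counts into the single document to index."""
    now = now or datetime.now(timezone.utc)
    ratio = dau / mau if mau else None
    return {"@timestamp": now.isoformat(), "dau": dau, "mau": mau, "dau_mau": ratio}

def post_json(url, body):
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Run the two aggregations, then write one combined document.
    dau = post_json(f"{ES_URL}/{SOURCE_INDEX}/_search",
                    cardinality_query("24h/h"))["aggregations"]["distinct_users"]["value"]
    mau = post_json(f"{ES_URL}/{SOURCE_INDEX}/_search",
                    cardinality_query("30d/d"))["aggregations"]["distinct_users"]["value"]
    post_json(f"{ES_URL}/{DEST_INDEX}/_doc", build_doc(dau, mau))
```

Schedule it with cron (or similar), and the resulting index has `dau`, `mau`, and the precomputed `dau_mau` ratio per day, ready for Lens.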

I looked at Transforms; it is possible there is some magic incantation to make this work, but I could not figure it out.

There is an interesting cumulative cardinality aggregation, but I don't think that helps you...

To be clear, DAU is easy to graph with Lens.
MAU, if you use 1-month buckets, will only show one point per month, which is not what you want, but it can certainly be graphed.

Perhaps someone else might have a suggestion.

Regarding the transform questions:

static monthly: By default a transform calculates a bucket after it is complete, e.g. after the month is over. You can tweak this behavior using `settings.align_checkpoints: false`. This way it calculates intermediate results; in other words, it keeps updating the current month.

moving 30 days: You can use a range query with `now-30d/d`; however, a transform requires something to group on. That can be a terms aggregation on e.g. an application name. That way you get the value per application, even if you only have one.
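Putting those two hints together, a continuous transform along these lines might be a starting point (index, field, and transform names are hypothetical, and this is untested; as noted earlier in the thread, the right incantation is elusive):

```
PUT _transform/mau_rolling_30d
{
  "source": {
    "index": "user_activity",
    "query": {
      "range": { "@timestamp": { "gte": "now-30d/d" } }
    }
  },
  "pivot": {
    "group_by": {
      "app": { "terms": { "field": "app_name" } }
    },
    "aggregations": {
      "mau": { "cardinality": { "field": "userId" } }
    }
  },
  "dest": { "index": "mau_rolling_30d" },
  "frequency": "1h",
  "sync": { "time": { "field": "@timestamp" } },
  "settings": { "align_checkpoints": false }
}
```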

Hey @stephenb, that makes sense to me! Can you point me in the right direction for writing scripts that run daily and post to an index? That's something I've never worked with in ELK. I'm running everything on Elastic Cloud.

@pocketcolin

Elastic is just a REST endpoint, so pick your favorite scripting tool; if you are a Python dev, that is certainly a very popular language.

You can use the HTTP REST API Directly

Or one of the Language Clients

With respect to Elastic Cloud... no difference, it is just an endpoint.

Get started come back and open a new thread with some questions when you are ready.

Heck, you could probably do it just as simply with cron, a bash script, curl, and jq:

https://jqlang.github.io/jq/

OK, makes sense! I'm going to go ahead and mark this as resolved, as it seems like that's the best direction to go from here. Thanks for all your help!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.