Rollup - date histogram issues

I have a rollup job with the following settings:

          "groups": {
            "date_histogram": {
              "field": "timestamp",
              "time_zone": "Europe/Stockholm",
              "calendar_interval": "1d"
            },

I also tried "calendar_interval": "24h" btw
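
For completeness, the full job definition looks roughly like this (the index pattern, cron schedule, page size, and metrics below are simplified placeholders; only the date_histogram group is exactly as configured):

PUT _rollup/job/transactions-daily
{
  "index_pattern": "lp-transactions-*",
  "rollup_index": "lp-reporting-transactions-daily",
  "cron": "0 0 2 * * ?",
  "page_size": 1000,
  "groups": {
    "date_histogram": {
      "field": "timestamp",
      "time_zone": "Europe/Stockholm",
      "calendar_interval": "1d"
    }
  },
  "metrics": [
    { "field": "amount", "metrics": [ "sum", "value_count" ] }
  ]
}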

As I understand from the documentation about rollups, querying with a smaller date histogram interval than the one configured in the job is not possible, but a bigger one is. Therefore, I'm expecting that when I create a Lens visualisation I will be able to choose 1d, 2d, 1w, 1M, etc. when "calendar_interval" is "1d", and similar options with the 24h setting (48h, for example).

What I'm getting in Lens instead is an error.

I've tried other visualisations too; nothing really works.

What I want to achieve is having daily totals of our transaction data and building dashboards with date histograms with intervals of 1d, 1w, 1M and 1y (ideally all calendar based since reporting in this business is calendar based).

Hopefully someone can point me in the right direction on how to achieve this?

Jeroen

Funnily enough, this works fine:

POST /lp-reporting-transactions-daily/_search?size=0
{
  "aggs": {
    "transactions": {
      "date_histogram": {
        "field": "timestamp.date_histogram.timestamp",
        "calendar_interval": "month"
      }
    }
  }
}

The same works for week and year.

It feels like the issue is related to Lens and other visualisations?

Hi @JeroenK, rollups are in technical preview and may not be around forever. The support in Kibana (Lens included) is limited.

The time series data stream with downsampling is a "blessed" solution moving forward. Does it look like it would fit your needs?
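
In case it helps: a TSDS is just a data stream whose backing indices run in time-series mode, set up through an index template. A minimal sketch, assuming a transactions data stream with made-up field names (merchant_id as a dimension, amount as a gauge metric):

PUT _index_template/transactions-tsds
{
  "index_patterns": [ "transactions-tsds*" ],
  "data_stream": {},
  "template": {
    "settings": {
      "index.mode": "time_series",
      "index.routing_path": [ "merchant_id" ]
    },
    "mappings": {
      "properties": {
        "@timestamp": { "type": "date" },
        "merchant_id": { "type": "keyword", "time_series_dimension": true },
        "amount": { "type": "double", "time_series_metric": "gauge" }
      }
    }
  }
}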

Thanks @Andrew_Tate, that explains things.

I was looking at downsampling indeed, but what I could not figure out is whether you lose any data. It reads like it 'samples' from the data instead of actually rolling it up.

My case is for analytical data about transactions and I want to aggregate what we have to daily totals. Will downsampling do that?

@neoaddix, with both rollups and TSDS downsampling you're taking a highly-sampled index (lots of documents) and transforming it into an index with fewer documents, each of which summarizes some number of documents from the original index. So, really, they're both "lossy compression" techniques and they could both be called "downsampling."

Each of the documents in the downsampled index stores aggregation information for the documents from the original index that it represents. In the case of TSDS downsampling, we store the min, max, sum, value_count, and average for each metric. (IIRC, rollups allow you to specify which of these you retain.)
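
To make that concrete, a single document in a downsampled index ends up looking roughly like this (field names invented for illustration): each metric collapses into a small summary of every original document in that time bucket.

{
  "@timestamp": "2024-05-01T00:00:00.000Z",
  "merchant_id": "m-42",
  "amount": {
    "min": 1.5,
    "max": 420.0,
    "sum": 18230.75,
    "value_count": 1093
  }
}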

The objective is to reduce storage costs while improving performance for certain aggregations (fewer documents to aggregate = faster).

You can imagine it like compressing an image where every four pixels becomes one single pixel, the average of the original four. When the image is at its normal size after compression, you probably won't notice a difference. However, the compression did reduce your ability to zoom in and see more granular details.

(Sorry if you knew all this, just stating it "out loud.")

Full disclosure: this is a little outside my area of expertise—might want to verify this in the Elasticsearch topic.

That said, I think this is probably the perfect case for downsampling. As you can see in the docs, you can downsample with an interval of one day ("fixed_interval": "1d"), so in your downsampled index you get one document summarizing each day. And you can do this with an ILM rule so that it happens automatically.
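
A minimal ILM sketch, with placeholder names and phase timings, where each backing index gets downsampled to one document per day once it rolls over into the warm phase:

PUT _ilm/policy/transactions-downsample
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_age": "7d" }
        }
      },
      "warm": {
        "min_age": "0d",
        "actions": {
          "downsample": { "fixed_interval": "1d" }
        }
      }
    }
  }
}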

You could always run a downsample using the API first to make sure things look as they should. Just remember that

Within a data stream, a downsampled index replaces the original index and the original index is deleted. Only one index can exist for a given time period.

so, when you run the downsample command, the original documents are gone (from the docs).
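
For reference, a manual run is roughly this sequence (index names are placeholders; the source index can't be the data stream's current write index and needs a write block first):

POST /transactions-tsds/_rollover

PUT /.ds-transactions-tsds-2024.05.01-000001/_block/write

POST /.ds-transactions-tsds-2024.05.01-000001/_downsample/downsampled-transactions-2024.05.01
{
  "fixed_interval": "1d"
}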

Does this help?

Thanks @Andrew_Tate, this is very helpful. I will play with this to validate whether it covers what I need. I'll post my results here.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.