Hi,
I have created 2 rollup jobs on the same raw index:
- one for daily rollup
- one for weekly rollup
Since I started the second job (weekly), it seems that the first one (daily) no longer performs its operations.
Is it possible to run two rollup jobs that build 2 different rollup indices from the same raw index?
See the following 2 jobs:
PUT _xpack/rollup/job/roll_1d
{
  "index_pattern": "pfs_pnsapi*_dfy",
  "rollup_index": "roll1d_pfs_pnsapi-_dfy",
  "cron": "0 7 14 * * ?",
  "page_size": 10000,
  "groups": {
    "date_histogram": {
      "field": "@timestamp",
      "interval": "1d",
      "delay": "7d",
      "time_zone": "UTC"
    },
    "terms": {
      "fields": ["tag.env", "tag.svcid", "data.q_client", "data.q_method", "data.q_submethod", "data.stream_id", "data.r_status"]
    }
  },
  "metrics": [
    {
      "field": "data.payloadsize",
      "metrics": ["min", "max", "sum", "avg"]
    },
    {
      "field": "data.message_count",
      "metrics": ["min", "max", "sum", "avg"]
    }
  ]
}
PUT _xpack/rollup/job/roll_1w
{
  "index_pattern": "pfs_pnsapi*_dfy",
  "rollup_index": "roll1w_pfs_pnsapi-_dfy",
  "cron": "0 0 10 ? * 2",
  "page_size": 10000,
  "groups": {
    "date_histogram": {
      "field": "@timestamp",
      "interval": "1w",
      "delay": "7d",
      "time_zone": "UTC"
    },
    "terms": {
      "fields": ["tag.env", "tag.svcid", "data.q_client", "data.q_method", "data.q_submethod", "data.stream_id", "data.r_status"]
    }
  },
  "metrics": [
    {
      "field": "data.payloadsize",
      "metrics": ["min", "max", "sum", "avg"]
    },
    {
      "field": "data.message_count",
      "metrics": ["min", "max", "sum", "avg"]
    }
  ]
}
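In case it helps diagnose, this is how I would check whether the daily job is still running (a sketch assuming the 6.x `_xpack/rollup` endpoints and the job IDs above):

GET _xpack/rollup/job/roll_1d
GET _xpack/rollup/job/roll_1w

If I understand the API correctly, the response for each job should include a `status` object (whether the job is started or stopped) and a `stats` object (documents processed, rollups indexed, trigger count), which should show whether the daily job has stopped indexing since the weekly one was created.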
Best regards