Hello everyone,
I have a Painless script that calculates user sessions. The script works fine, but when there are more than 10k entries, the loop in the Painless script throws this error:
"caused_by": {
"type": "painless_error",
"reason": "painless_error: The maximum number of statements that can be executed in a loop has been reached."
}
How can we deal with larger data sizes in a Painless script?
And here is my query:
{
  "size": 0,
  "query": {
    "bool": {
      "filter": [
        {
          "range": {
            "time": {
              "lt": "2020-07-01T10:32:43.868Z",
              "gt": "2020-01-01T10:31:58.836Z"
            }
          }
        }
      ]
    }
  },
  "aggs": {
    "sessions": {
      "terms": {
        "field": "user.id",
        "size": 5,
        "shard_size": 10
      },
      "aggs": {
        "session": {
          "scripted_metric": {
            "init_script": "state.times = [];",
            "map_script": "state.times.add(doc['time'].value.toInstant().toEpochMilli())",
            "combine_script": "def start_session = state.times[0]; def total_time = 0; def treshold = 60; if(state.times.length <= 1) { return treshold; } for(int i=1;i<state.times.length;i++) { def difference = Math.abs((state.times[i] - start_session) / 1000); if(difference!=null) { if(difference < treshold) { total_time = total_time + difference; } else { total_time = total_time + treshold; } } start_session = state.times[i]; } return total_time;",
            "reduce_script": "return states[0]"
          }
        }
      }
    }
  }
}
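
For readability, here is the combine_script from the query above expanded into multi-line form with comments. The logic is unchanged, except that the redundant difference != null check is dropped (Math.abs always returns a number):

// combine_script expanded for readability; same logic as the one-line version in the query
def start_session = state.times[0];   // timestamp of the previous event, in epoch millis
def total_time = 0;                   // accumulated session time, in seconds
def treshold = 60;                    // maximum gap (seconds) still counted as the same session
if (state.times.length <= 1) {
  return treshold;                    // a single event counts as one threshold-long session
}
for (int i = 1; i < state.times.length; i++) {
  // gap between this event and the previous one, in seconds
  def difference = Math.abs((state.times[i] - start_session) / 1000);
  if (difference < treshold) {
    total_time = total_time + difference;  // same session: add the real gap
  } else {
    total_time = total_time + treshold;    // gap too large: cap the contribution at the threshold
  }
  start_session = state.times[i];
}
return total_time;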
Thanks a lot.