So I have a simple challenges file:
    {
      "schedule": "poisson"
    },
    {
      "operation": "index-append",
      "schedule": "poisson"
    },
    {
      "operation": "phrase",
      "iterations": 1000,
      "schedule": "poisson"
    }
  ]
}
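(The snippet above is just the tail of the file; the overall layout follows the usual Rally challenge structure, roughly like this, where the name is a placeholder:

{
  "name": "my-challenge",
  "schedule": [
    {
      "operation": "index-append",
      "schedule": "poisson"
    }
  ]
}
)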
I run this command:
esrally --pipeline=benchmark-only --target-hosts host-to-test:80
I see the tool output:
Running index-append [100% done]
Running phrase [100% done]
I see this request log on the host:
...many before...
[2018-07-31T00:17:04+00:00] 10.20.0.3 - "c1c-searchclienta101-prod" "POST /_bulk HTTP/1.1" 200 1285036 "-" "-" "-" 1.200 - "-"
[2018-07-31T00:17:06+00:00] 10.20.0.3 - "c1c-searchclienta101-prod" "POST /_bulk HTTP/1.1" 200 1285036 "-" "-" "-" 1.167 - "-"
[2018-07-31T00:17:06+00:00] 10.20.0.3 - "c1c-searchclienta101-prod" "POST /_bulk HTTP/1.1" 200 231335 "-" "-" "-" 0.075 - "-"
[2018-07-31T00:17:11+00:00] 10.20.0.3 - "c1c-searchclienta101-prod" "GET /geonames/_search?request_cache=false HTTP/1.1" 200 4358 "-" "-" "-" 0.006 - "-"
[2018-07-31T00:17:11+00:00] 10.20.0.3 - "c1c-searchclienta101-prod" "GET /geonames/_search?request_cache=false HTTP/1.1" 200 4358 "-" "-" "-" 0.005 - "-"
[2018-07-31T00:17:11+00:00] 10.20.0.3 - "c1c-searchclienta101-prod" "GET /geonames/_search?request_cache=false HTTP/1.1" 200 4358 "-" "-" "-" 0.004 - "-"
...many after...
However, I want my operations to emulate production traffic: bulks arriving at a certain rate with some searches interleaved, then bulks again, searches again, and so on. Defining all of those individually would be quite a pain. Is there a way to schedule several operations in parallel, so that the requests for all of them come in at random?
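What I'm imagining is something along these lines: wrapping the tasks so they run concurrently, each with its own Poisson arrival rate. This is only a sketch of what I'd like to write, and the target-throughput and clients values are made up:

"schedule": [
  {
    "parallel": {
      "tasks": [
        {
          "operation": "index-append",
          "schedule": "poisson",
          "target-throughput": 100
        },
        {
          "operation": "phrase",
          "schedule": "poisson",
          "target-throughput": 10,
          "clients": 2
        }
      ]
    }
  }
]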
I tried searching for something like this in the docs, but the only thing that seems to point in the right direction is a custom scheduler. I'm not sure whether I should write one.
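If a custom scheduler is the way to go, my understanding from the docs is that it would be registered under a name and then referenced from each task's schedule property, roughly like this (the scheduler name here is hypothetical):

{
  "operation": "phrase",
  "schedule": "my-production-scheduler",
  "iterations": 1000
}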