How to send the same eventdata track (1 billion JSON documents) to multiple ES instances

Hi ,

I have configured 4 independent ES instances on a single server by providing unique values (per instance) for the following settings in elasticsearch.yml (see the sketch after this list):

  • http.port
  • transport.tcp.port
  • path.data
  • path.logs
  • path.pid
  • node.name
  • cluster.name
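
For example, instance 2's elasticsearch.yml looks roughly like this (a minimal sketch; the ports, paths, and names here are illustrative, not my exact values):

# elasticsearch.yml for instance 2 (illustrative values)
cluster.name: es-bench-2            # unique cluster name per instance
node.name: node-2                   # unique node name per instance
http.port: 9202                     # unique HTTP port per instance
transport.tcp.port: 9302            # unique transport port per instance
path.data: /home/ssuvarna/es2/data  # separate data directory per instance
path.logs: /home/ssuvarna/es2/logs  # separate log directory per instance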

All 4 ES instances are now running successfully on my server, each on a different port.

I started Rally with the following command:

esrally --track=eventdata --track-repository=eventdata --pipeline=benchmark-only --challenge=elasticlogs-1bn-load --track-params="/home/ssuvarna/parameter_custom_evendata.json" --target-hosts="target_host.json" --report-file=/home/ssuvarna/report.md --report-format=csv &

target_host.json:

{
  "default": [
    {"host": "127.0.0.1", "port": 9200},
    {"host": "127.0.0.1", "port": 9202},
    {"host": "127.0.0.1", "port": 9203},
    {"host": "127.0.0.1", "port": 9204}
  ]
}

Rally is currently sending JSON documents to each ES instance. Please see the issues below:

1) We expected Rally to send 1 billion JSON documents to each ES instance, but it is sending 250,000,000 documents to each instance (i.e., 1 billion / 4 instances = 250,000,000 docs per instance).
2) Each ES instance should have an index named "elasticlogs_q-000001", but only one ES instance has that index; the other 3 instances only have an index named "elasticlogs_q_write".

Can you please let me know what changes I need to make in Rally to get the expected behavior?

Rally expects to work with a single cluster and will distribute the load across the nodes you list. I do not believe you can get it to send the same load to multiple clusters in a single run.
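
If you want every instance to receive the full 1 billion documents, the usual approach is to run one benchmark per instance, each targeting a single host. A minimal sketch (untested; it reuses the ports from your target_host.json and runs sequentially, since concurrent Rally runs from the same installation may contend for Rally's local state in ~/.rally):

#!/usr/bin/env bash
# Run the same eventdata challenge once against each ES instance, one at a time.
for port in 9200 9202 9203 9204; do
  esrally --track=eventdata --track-repository=eventdata \
    --pipeline=benchmark-only \
    --challenge=elasticlogs-1bn-load \
    --track-params="/home/ssuvarna/parameter_custom_evendata.json" \
    --target-hosts="127.0.0.1:${port}" \
    --report-file="/home/ssuvarna/report_${port}.csv" \
    --report-format=csv
done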

Thanks a lot for the info.
