I was looking into using the Python elasticsearch helpers to perform bulk loads of data into ES. However, the bulk loads I want to do would go into several indices simultaneously (I am time-slicing the data into separate indices so that anything older than 30 days can be purged). Doing this kind of insert with curl is super easy:
curl -XPOST localhost:9200/_bulk --data-binary @toload.json
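
For reference, toload.json is just the standard bulk format, where each action line can name its own target index. The index names, type, and documents below are made up for illustration:

    {"index": {"_index": "logs-2015.06.01", "_type": "log"}}
    {"message": "first event", "@timestamp": "2015-06-01T00:00:00Z"}
    {"index": {"_index": "logs-2015.06.02", "_type": "log"}}
    {"message": "second event", "@timestamp": "2015-06-02T00:00:00Z"}
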
However, it looks like the Python helpers take an iterable of actions that all get inserted into the same index. Am I missing something super simple, or do the helpers just not offer a way to bulk load across multiple indices?
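
Here is roughly what I have with the helpers now. This is a minimal sketch; the index name, doc type, and document shape are placeholders for illustration:

    from elasticsearch import Elasticsearch, helpers

    es = Elasticsearch()  # defaults to localhost:9200

    # Documents to load; the shape here is just an example.
    actions = ({"message": "event %d" % i, "@timestamp": "2015-06-01T00:00:00Z"}
               for i in range(1000))

    # Every document in the iterable seems to land in the single index
    # named here; I don't see an obvious way to route per document.
    helpers.bulk(es, actions, index="logs-2015.06.01", doc_type="log")

Do I really have to split the documents by day and call bulk() once per index, or can a single call carry documents bound for different indices, like the curl version does?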