Connection pooling in Python

bry-c,
Thanks for your comments.
I am using Flask and Apache2. My existing connection pooling implementation is similar to what you have mentioned.
After your reply, I re-read the following links/sections, which gave me the details and clarifications I had initially missed:

https://elasticsearch-py.readthedocs.io/en/master/#persistent-connections
elasticsearch-py uses persistent connections inside of individual connection pools (one per each configured or sniffed node)

https://elasticsearch-py.readthedocs.io/en/master/#thread-safety
By default we allow urllib3 to open up to 10 connections to each node, if your application calls for more parallelism, use the maxsize parameter to raise the limit:

# maxsize sets the connection pool size per node
es = Elasticsearch(["host1", "host2"], maxsize=25)

Query

  1. Here, I am assuming that the "maxsize" parameter sets the number of persistent connections per configured node.
  2. As per the thread-safety link above:
    If your application is long-running consider turning on Sniffing to make sure the client is up to date on the cluster location.

My application runs almost 24x7, so should I turn on the sniffing mechanism?
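For reference, here is a minimal sketch of the sniffing options I am considering, based on the parameter names in the elasticsearch-py docs (sniff_on_start, sniff_on_connection_fail, sniffer_timeout); the host names and the 60-second interval are placeholders, not recommendations:

```python
# Sniffing options for a long-running client (parameter names from
# elasticsearch-py; values here are placeholder assumptions).
sniff_opts = {
    "sniff_on_start": True,            # inspect the cluster when the client starts
    "sniff_on_connection_fail": True,  # refresh the node list after a node failure
    "sniffer_timeout": 60,             # re-sniff the cluster at most every 60 seconds
}

# With elasticsearch-py installed, this would be passed alongside maxsize:
# from elasticsearch import Elasticsearch
# es = Elasticsearch(["host1", "host2"], maxsize=25, **sniff_opts)
print(sniff_opts)
```

My understanding is that each sniff rebuilds the per-node connection pools, so newly added or removed nodes are picked up without restarting the application.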

Thanks & Regards,
Sachin Vyas.