Elasticsearch OpenJDK process takes over every available TCP port

Hi,

I am facing an issue with Elasticsearch: on one of our PCs, the ES Java process took over every available TCP port on the system and held them, so the system entered TCP port starvation and stopped functioning.

My questions are:

  1. What is the purpose of the many established connections belonging to the Elasticsearch process that I see in the netstat output?
  2. Can I limit the ports that are opened?
  3. Do you have a guess as to what could be causing the issue? I can see on other machines that many ports are established, but the issue above occurs on only one machine (for now).

Thanks in advance.

Welcome to our community! :smiley:

Can you provide a little more information on what you were seeing? Do you have netstat output and errors from Elasticsearch?
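On a Linux host, something like this would capture it (a sketch: it assumes iproute2's `ss` is installed and that the Elasticsearch main class appears in the process command line; on Windows, `netstat -ano` filtered by the ES PID gives the equivalent):

```shell
# count_established: count the connection lines owned by a given pid.
# Reads `ss -tnp state established` output on stdin.
count_established() {
  grep -c "pid=$1"
}

# Live usage (the pgrep pattern is an illustration -- substitute your
# actual Elasticsearch PID if it differs):
#   ss -tnp state established | count_established "$(pgrep -f Elasticsearch | head -n1)"
```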

Hi,

Thanks for your reply.
Here are some of our screenshots:

[screenshot: netstat output showing many established connections]

We know that the process creating those connections is the ES process (not shown in the image).
The number of connections differs between machines: some have ~10 connections, some ~30, and it looks like ES on the affected machine used up all the ports.
I couldn't figure out (or find an explanation on the web) what these connections are, what can cause ES to create so many of them, or how to restrict the port range they use.

Thank you again in advance!

Every Elasticsearch node will open 10-13 TCP channels to every other node in the cluster, and ≤18 TCP channels to any remote clusters. This is normal behaviour and even in a very large cluster it wouldn't be enough for you to run out of local ports.
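If you want to check this on your own nodes, you can group the established connections on the transport port by remote peer (a sketch assuming Linux `netstat -tn` output and the default transport port 9300; adjust the port if your cluster uses a custom one):

```shell
# per_peer: count established connections to port 9300, grouped by remote IP.
# Reads `netstat -tn` style output on stdin; prints "count peer" per remote.
per_peer() {
  awk '$6 == "ESTABLISHED" && $5 ~ /:9300$/ {split($5, a, ":"); print a[1]}' \
    | sort | uniq -c | sort -rn
}

# Live usage on Linux:
#   netstat -tn | per_peer
```

Each remote node should show up with roughly the channel counts described above; a single peer with hundreds of connections would point at the culprit.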


Do you think security settings could cause ports to be opened in a loop? Can I limit the channel count for each node? Thank you

The channel count is limited as described in my previous post.

There is something wrong with your system config if you are running out of ports after opening just 30 connections. You will need to discuss this with your local system administrator.
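A few things worth checking with them (a sketch, assuming a Linux host; the default ephemeral port range gives you roughly 28k outbound ports, so ~30 connections should be nowhere near exhaustion):

```shell
# Ephemeral port range (Linux default is roughly 32768-60999):
#   cat /proc/sys/net/ipv4/ip_local_port_range
# Overall socket summary, including any TIME_WAIT buildup:
#   ss -s

# top_pids: tally sockets per owning pid from `ss -tnp` output on stdin,
# to confirm which process is actually holding the ports.
top_pids() {
  awk -F'pid=' 'NF > 1 {split($2, a, ","); print a[1]}' | sort | uniq -c | sort -rn
}

# Live usage:  ss -tnp | top_pids | head
```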
