Access elasticsearch from public ip instead of LAN ipv4

I have Elasticsearch installed on one machine with:

  1. local LAN IPv4 = 192.168..
  2. public IP = 39...*

In my elasticsearch.yml I have:

http.port: 9200
network.host: 0.0.0.0

I can access it from within the same network at ipv4:9200, but I want to access Elasticsearch from a different machine at publicip:9200, and I can't.
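For anyone debugging the same symptom, a quick way to check whether a host/port is reachable at all, before blaming Elasticsearch, is a plain TCP connect. This is a generic sketch, not part of the original thread; the host and port you pass in are whatever you are testing:

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Try the LAN address first, then the public one.  If the LAN check
# passes but the public-IP check fails, the problem is routing/NAT or
# the firewall, not Elasticsearch itself.
print(port_reachable("127.0.0.1", 9200))
```

Run it once against the 192.168.. address and once against the public IP from an outside machine to see where the path breaks.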

Please read Is it safe to expose Elasticsearch to the Internet? first.

That means Elasticsearch will bind to any and all IPs that are attached to network interfaces on the host.

Is there a firewall in place?

Yes, and I have also created inbound rules for ports 9200 and 9300.

Before creating those rules I was not even getting access at ipv4:9200, but now I am.

Now I want to access Elasticsearch from a machine on a different network via http://publicip:9200.

I am getting "This site can't be reached".

You do not want to expose port 9300 to the internet; there's no reason to at all.

Can you curl the public IP on 9200 to get a response? Are you sure that IP is attached to a network interface on the host?

There is no response from curl.
Can you tell me how I should attach my public IP to the network interface on the Windows host?

That depends on the OS.

I am using Windows 10.

Running ipconfig in a command prompt should show you.

ipconfig shows me [localIP], which is 192.168..

I want to access Elasticsearch from a different machine using the public IP, which is 39.xxx.xxx.xxx

If it doesn't show the public IP then you will need to get it assigned to an interface.
That's outside the scope of what we can help with, sorry.

Hi @Abdul_Samad, I just want to add to what @warkolm said.

1st, please read the discussion he posted; it is very important.

2nd, just so it is exactly clear: if you run Elasticsearch as you describe above, using HTTP on a truly public IP, anyone on the internet can read, create, delete, or alter your data. At the very least, secure it with SSL and basic authentication.
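To make the "at the very least" part concrete: on recent Elasticsearch versions the relevant elasticsearch.yml settings look roughly like this. This is a sketch, not a drop-in config; the keystore paths are placeholders, and you still need to generate certificates and set passwords separately:

```yaml
# Turn on the security features (TLS + authentication)
xpack.security.enabled: true

# Encrypt HTTP client traffic (port 9200)
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.keystore.path: http.p12            # placeholder path

# Encrypt node-to-node transport traffic (port 9300)
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.keystore.path: transport.p12  # placeholder path
```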

See here

This is most likely a network configuration issue rather than an Elasticsearch issue. While forwarding ports from the internet-facing IP of your router/firewall will allow public clients to access Elasticsearch, it will not allow private/internal clients to access Elasticsearch via the same public IP. Your router/firewall must support and be configured for NAT "hairpinning" to do that.

Even if you can configure NAT hairpinning, I would recommend against this, as it means all traffic to/from Elasticsearch MUST traverse your router/firewall, only to be turned around and sent back out the same interface that it entered. This unnecessarily adds latency and impacts performance.

A better solution is to leverage DNS to direct traffic to Elasticsearch by hostname or FQDN. Your external DNS provider would be configured to resolve public DNS requests to the public IP, while your internal DNS would be configured to resolve to the private IP. This ensures that both internal and external clients can reach Elasticsearch, even if your router/firewall doesn't support hairpinning, and with the best performance.
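As an illustration of that split-horizon DNS setup, using a hypothetical name es.example.com (zone-file style; the name is made up and the IPs are the placeholders from this thread):

```
; External DNS — answers internet clients
es.example.com.  IN  A  39.xxx.xxx.xxx   ; router's public IP, port-forwarded to ES

; Internal DNS — answers LAN clients
es.example.com.  IN  A  192.168.x.x      ; Elasticsearch host's LAN IP
```

Both sets of clients then use https://es.example.com:9200 and each gets the address that is actually reachable from where it sits.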


Hello Abdul,
When a service/app starts, it looks for the assigned network interface to listen on. Setting it to 0.0.0.0 means it accepts data from, and can talk to, everyone, no matter where the query comes from. So if you have 3-4 network cards on that server with different IPs, data can be inserted from different networks, internal or external, even from localhost (127.0.0.1). Let's say you have two network cards with IPs 192.168.0.2 and 10.10.0.2, plus the loopback 127.0.0.1: if you set 0.0.0.0 as the IP to listen on, anyone from 10.10.x.x, 192.168.0.x, or 127.0.0.1 can insert data into or query your DB. If you want it to be accessible from the internet (which is not really recommended), you need to know your use case very well.
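In elasticsearch.yml terms, you can bind more narrowly than 0.0.0.0; Elasticsearch also accepts special values for network.host. A sketch of the alternatives (the concrete IP is a placeholder; pick one line, not all three):

```yaml
# loopback only — reachable from this machine alone
network.host: _local_

# any site-local (private LAN) address, e.g. 192.168.x.x
network.host: _site_

# or one explicit interface address
network.host: 192.168.0.2
```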

If you are behind a personal gateway (a router provided by your ISP), I would set the IP in the config to the server's internal IP (192.168.x.x), and I recommend port forwarding in the gateway from the public IP to your local IP on the specific ports you need. DO NOT point the gateway's DMZ at your Elasticsearch internal IP, as all the ports of that machine will then be exposed to the internet and you open the door to more vulnerabilities, not only the database. You can then create rules in the machine's firewall to allow access on those ports only from specific machines, using MAC-address filtering, IP filtering, or any other trust mechanism you see fit (TLS and SSL auth might add another layer of security that you need).
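On Windows, a scoped inbound rule like the one described can be created from an elevated command prompt with netsh. A sketch only; the remoteip value is a placeholder you would replace with the specific client addresses you actually trust:

```
netsh advfirewall firewall add rule name="Elasticsearch HTTP" ^
    dir=in action=allow protocol=TCP localport=9200 ^
    remoteip=203.0.113.5
```

Restricting remoteip is what turns "port 9200 is open" into "port 9200 is open to these machines only".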

If you are behind a corporate firewall, make sure your IT dept creates comprehensive rules from the public IP to your Elasticsearch server.

If it were me wanting to give a customer or a friend, who does not have access to my private network, access to my data using Kibana, Grafana, or some other custom tool, I would put the Kibana inside my network (e.g. over a VPN), make it talk to Elasticsearch, and expose that Kibana/Grafana machine only on port 5601 or whatever proxy port you use. I would still enforce TLS between the two, and also activate user authentication with well-defined user rights per group; that way not everyone can write to or delete from your DB.
This is if you want to play and see how things work.
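For that Kibana-in-front pattern, the relevant kibana.yml settings look roughly like this. A sketch only; the LAN IP, credentials, and certificate path are placeholders:

```yaml
# Port exposed to the outside (forward only this one)
server.port: 5601
server.host: "0.0.0.0"

# Talk to Elasticsearch over TLS on the private network
elasticsearch.hosts: ["https://192.168.0.2:9200"]              # placeholder LAN IP
elasticsearch.username: "kibana_system"
elasticsearch.password: "changeme"                             # placeholder; use the Kibana keystore instead
elasticsearch.ssl.certificateAuthorities: ["/path/to/ca.crt"]  # placeholder path
```

Only 5601 gets forwarded at the gateway; 9200 and 9300 stay private.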

As a best practice, I would follow one of the main rules of database security: NEVER EXPOSE YOUR DB TO THE INTERNET, unless it's a honeypot.

Hope this helps