I'm facing "NoNodeAvailableException" but don't know what I'm doing wrong!

Hi

I'm creating a new topic because I did not find a solution to my problem in the other threads.

My problem is that I get the following error when I try to perform a search query on Elasticsearch with a Java client:

InetSocketAddress: NoNodeAvailableException[None of the configured nodes are available: [{#transport#-1}{mqrDOkH-Sbu4haRYxMekXg}{xxx.xxx.xxx.xxx}{xxx.xxx.xxx.xxx:9300}]]

Here is a curl call where I can see my cluster information:

curl -XGET 'xxx.xxx.xxx.xxx/'
{
  "name" : "my-node-name-1",
  "cluster_name" : "my-cluster-name",
  "cluster_uuid" : "lmhoyPBWQTu2SSt-3BabRg",
  "version" : {
    "number" : "5.4.0",
    "build_hash" : "780f8c4",
    "build_date" : "2017-04-28T17:43:27.229Z",
    "build_snapshot" : false,
    "lucene_version" : "6.5.0"
  },
  "tagline" : "You Know, for Search"
}

I think I have the right versions for the libraries I use to connect; here is my pom:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>test.elasticsearch</groupId>
    <artifactId>TestElasticSearch</artifactId>
    <version>1.0-SNAPSHOT</version>

    <packaging>jar</packaging>

    <name>transaction-generator</name>
    <url>http://maven.apache.org</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.9</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.elasticsearch.client</groupId>
            <artifactId>transport</artifactId>
            <version>5.4.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-api</artifactId>
            <version>2.7</version>
        </dependency>
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-core</artifactId>
            <version>2.7</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.0</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>

And I'm using this class I found on this forum:

import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.UnknownHostException;

import org.elasticsearch.action.get.GetResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.NoNodeAvailableException;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.elasticsearch.transport.client.PreBuiltTransportClient;

public class TrivialClient {

    public static void main(String[] args) throws UnknownHostException {
        String host = "xxx.xxx.xxx.xxx";
        InetSocketTransportAddress transportAddress = new InetSocketTransportAddress(
                InetAddress.getByName(host), 9300);
        createClientPrintResponse("getByName("xxx.xxx.xxx.xxx)", transportAddress);

        transportAddress =
                new InetSocketTransportAddress(new InetSocketAddress(host, 9300));
        createClientPrintResponse("InetSocketAddress", transportAddress);
    }

    private static void createClientPrintResponse(String description,
            InetSocketTransportAddress transportAddress) {
        System.out.println("begining : " + description + "... ");

        Settings settings = Settings.builder()
                .put("cluster.name", "my-cluster-name").build();
        Client client = new PreBuiltTransportClient(settings)
                .addTransportAddress(transportAddress);
        try {
            GetResponse response = client.prepareGet("comicbook", "superhero", "1").get();
            System.out.println(description + ": " + response);
        } catch (NoNodeAvailableException e) {
            System.out.println(description + ": " + e);
            //e.printStackTrace();
        } finally {
            // Release the transport client's resources once the call is done.
            client.close();
        }
    }
}

Does somebody know what I'm doing wrong? Or what I'm missing?

Please format your code using the </> icon as explained in this guide. It will make your post more readable.

Or use markdown style like:

```
CODE
```

I edited your post.

What are the full Elasticsearch server logs, please?
Are you running the Java app from the same machine where Elasticsearch is hosted?

Sorry for the malformed post!

I'm running my code from my machine and Elasticsearch is hosted on another machine.
But the curl I posted works fine from my machine.

For the moment, we do not see anything in the logs on the Elasticsearch host (nor on the other nodes of the cluster).

Here is the /etc/elasticsearch/elasticsearch.yml file of the concerned machine.

cluster.name: my-cluster-name
node.name: my-node-name-1
path.conf: /etc/elasticsearch
plugin.mandatory: discovery-ec2
discovery.type: ec2
discovery.ec2.groups: SG-PROD-LOGGING-ELASTICSEARCH-RTB
discovery.ec2.host_type: private_ip
cloud.aws.region: us-east
http.enabled: true
http.port: 9200
transport.tcp.port: 9300
bootstrap.memory_lock: true
node.master: false
node.data: false
node.ingest: false
network.host: xxx.xxx.xxx.xxx
discovery.ec2.tag.escluster: nv-rtb5
discovery.zen.minimum_master_nodes: 2
search.default_search_timeout: 30s
indices.fielddata.cache.size: 40%

Note that the IP "xxx.xxx.xxx.xxx" in the elasticsearch.yml, in TrivialClient.java, and in the curl call is the same.

So network.host has the public IP address, right?

Everything looks good.

I wonder if you opened the 9300 port in AWS?
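
One quick way to check that from the client machine is to try opening a plain TCP connection to 9300. A minimal sketch (the class name and the 5 second timeout are arbitrary; the host placeholder is the same as above):

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {

    public static void main(String[] args) {
        String host = "xxx.xxx.xxx.xxx";
        try (Socket socket = new Socket()) {
            // Fails fast if the security group does not allow 9300 from this machine.
            socket.connect(new InetSocketAddress(host, 9300), 5000);
            System.out.println("Port 9300 is reachable");
        } catch (IOException e) {
            System.out.println("Port 9300 is NOT reachable: " + e);
        }
    }
}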

BTW I think it's a bad idea to expose your Elasticsearch on the internet.
At least you should add X-Pack (commercial) to protect it.

(Or consider cloud.elastic.co which has all that built-in and managed).

Thanks a lot!!!!

We've found the issue thanks to your helpful questions!
It was related to the port not being open (so obvious... :frowning: )

We do not use public IPs, and the Elasticsearch cluster we use is not reachable outside our VPN.

That makes me wonder why we need to connect to port 9300 (which seems to be the port used by the cluster itself) with the Java client, while I want to perform a search query on port 9200 (like I do with my curl)?

Use the Java REST client instead.
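
For example, the low-level REST client talks to port 9200, just like your curl call does. A minimal sketch, assuming the org.elasticsearch.client:rest dependency in the same 5.4.0 version (the host placeholder and document coordinates are kept from your TrivialClient):

import org.apache.http.HttpHost;
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class TrivialRestClient {

    public static void main(String[] args) throws Exception {
        try (RestClient restClient = RestClient.builder(
                new HttpHost("xxx.xxx.xxx.xxx", 9200, "http")).build()) {
            // Same document as client.prepareGet("comicbook", "superhero", "1")
            Response response = restClient.performRequest("GET", "/comicbook/superhero/1");
            System.out.println(EntityUtils.toString(response.getEntity()));
        }
    }
}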

Oh, much better!
That was the solution I was looking for!

Thanks David!

Maybe you know a good/easy library to parse the response and store it as CSV?

parse the response

The High Level Rest Client can parse the response.
Otherwise you can use Jackson to read the response stream and parse it as a Map or whatever.
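
For example, a minimal sketch with Jackson (it assumes jackson-databind is on the classpath; the helper name is only illustrative):

import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Response;

public class ResponseParsing {

    @SuppressWarnings("unchecked")
    static Map<String, Object> toMap(Response response) throws Exception {
        String body = EntityUtils.toString(response.getEntity());
        // Jackson maps the JSON object to nested Maps and Lists.
        return new ObjectMapper().readValue(body, Map.class);
    }
}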

store it as csv

Not a library per se, but Logstash can do that.

Thanks David.

The High Level REST Client does what I did myself (in a better way).
The remaining thing I need to do is load my query from a file (I'm looking into Script for that).
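
A minimal sketch of one way to do that, reading the body from a local file and posting it as a search with the low-level REST client (query.json and the index name are placeholders):

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Collections;

import org.apache.http.HttpHost;
import org.apache.http.entity.ContentType;
import org.apache.http.nio.entity.NStringEntity;
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class FileQuery {

    public static void main(String[] args) throws Exception {
        // query.json is a placeholder for the file holding the search body.
        String query = new String(Files.readAllBytes(Paths.get("query.json")), StandardCharsets.UTF_8);
        try (RestClient restClient = RestClient.builder(
                new HttpHost("xxx.xxx.xxx.xxx", 9200, "http")).build()) {
            Response response = restClient.performRequest(
                    "POST", "/comicbook/_search",
                    Collections.<String, String>emptyMap(),
                    new NStringEntity(query, ContentType.APPLICATION_JSON));
            System.out.println(EntityUtils.toString(response.getEntity()));
        }
    }
}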

Regarding the CSV, my goal is not exactly to generate a CSV file, but to save data "row by row" in Redshift, like I can see in some Kibana visualizations. I'll see what I can do.

Have a look at the scroll API if your intention is to extract a lot of data.
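
A minimal sketch of how that could look with the low-level REST client (the index name, page size, and 1m scroll timeout are just placeholders):

import java.util.Collections;
import java.util.List;
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.http.HttpHost;
import org.apache.http.entity.ContentType;
import org.apache.http.nio.entity.NStringEntity;
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class ScrollThrough {

    @SuppressWarnings("unchecked")
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        try (RestClient restClient = RestClient.builder(
                new HttpHost("xxx.xxx.xxx.xxx", 9200, "http")).build()) {
            // Open the scroll: keep the search context alive for one minute per page.
            Response response = restClient.performRequest(
                    "POST", "/comicbook/_search",
                    Collections.singletonMap("scroll", "1m"),
                    new NStringEntity("{\"size\":1000,\"query\":{\"match_all\":{}}}",
                            ContentType.APPLICATION_JSON));
            while (true) {
                Map<String, Object> body = mapper.readValue(
                        EntityUtils.toString(response.getEntity()), Map.class);
                String scrollId = (String) body.get("_scroll_id");
                List<Map<String, Object>> hits = (List<Map<String, Object>>)
                        ((Map<String, Object>) body.get("hits")).get("hits");
                if (hits.isEmpty()) {
                    break; // no more pages
                }
                for (Map<String, Object> hit : hits) {
                    System.out.println(hit.get("_source")); // one "row" per document
                }
                // Ask for the next page using the scroll id from the previous response.
                response = restClient.performRequest(
                        "POST", "/_search/scroll",
                        Collections.<String, String>emptyMap(),
                        new NStringEntity("{\"scroll\":\"1m\",\"scroll_id\":\"" + scrollId + "\"}",
                                ContentType.APPLICATION_JSON));
            }
        }
    }
}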

Thanks for the advice.
I'll take a look at this.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.