NoClassDefFoundError: Could not initialize class org.elasticsearch.common.network.NetworkService

// Lazily initialize a singleton TransportClient (Elasticsearch 5.x transport client API).
public static TransportClient getClient() {
    synchronized (EsUtil.class) {
        if (client == null) {
            try {
                Settings settings = Settings.builder()
                        .put("cluster.name", clusterName)
                        .put("client.transport.sniff", true)
                        .build();

                // Connect to two cluster nodes on the transport port.
                client = new PreBuiltTransportClient(settings)
                        .addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("10.6.80.104"), 9320))
                        .addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("10.6.80.119"), 9320));
            } catch (UnknownHostException e) {
                e.printStackTrace();
            }
        }
    }
    return client;
}

I want to submit a Spark job from my local machine to a remote cluster. When I use spark.master=local[*], it works well, but when I change it to my remote cluster URI, it throws the exception in the topic title. Does anyone know how to solve this? My ES version is 5.3.0, and I also tried 5.2.2, but it didn't work.

This does not look ES-Hadoop related, but rather an issue with the Elasticsearch transport client. Have you thought about using ES-Hadoop for your spark workloads instead?
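With ES-Hadoop you would drop the transport client entirely and let the connector talk to Elasticsearch over HTTP from the executors. A minimal sketch of what that might look like, assuming the `elasticsearch-spark-20` jar is on the classpath, that the nodes from the snippet above expose HTTP on the default port 9200 (ES-Hadoop uses the REST layer, not the 9320 transport port), and a hypothetical index/type name `myindex/mytype`:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

public class EsSparkWriteSketch {
    public static void main(String[] args) {
        // ES-Hadoop is configured through es.* properties on the SparkConf;
        // the node addresses here are taken from the snippet above and are assumptions.
        SparkConf conf = new SparkConf()
                .setAppName("es-hadoop-write")
                .set("es.nodes", "10.6.80.104,10.6.80.119")
                .set("es.port", "9200");

        JavaSparkContext jsc = new JavaSparkContext(conf);

        // A single example document; in practice this RDD would come from your job.
        Map<String, Object> doc = new HashMap<>();
        doc.put("message", "hello from spark");
        JavaRDD<Map<String, Object>> docs = jsc.parallelize(Arrays.asList(doc));

        // Writes the RDD to the given index/type; serialization and bulk
        // requests are handled by the connector on each executor.
        JavaEsSpark.saveToEs(docs, "myindex/mytype");

        jsc.stop();
    }
}
```

Because the connector runs over HTTP and is shipped with the job, this also sidesteps the NoClassDefFoundError you get when the transport client's dependencies aren't present on the remote cluster's classpath.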

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.