Spark 1.5.1 + Elasticsearch Integration

SBT dependency

libraryDependencies += "org.elasticsearch" % "elasticsearch-spark_2.10" % "2.2.0-m1"
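
For context, a minimal build.sbt around this dependency would look roughly like the sketch below (the Scala version and the Spark artifacts are assumptions, chosen to match the _2.10 artifact suffix and Spark 1.5.1; they are not from the original post):

// build.sbt (sketch): scalaVersion must match the _2.10 suffix of the connector artifact
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  // Spark itself is typically "provided", since spark-submit supplies it at runtime
  "org.apache.spark" %% "spark-core"      % "1.5.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.5.1" % "provided",
  "org.elasticsearch" % "elasticsearch-spark_2.10" % "2.2.0-m1"
)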

Leads to the following error at runtime:

java.lang.NoClassDefFoundError: org/elasticsearch/spark/rdd/EsSpark$

// Code snippets

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.elasticsearch.spark._
import org.elasticsearch.spark.rdd.EsSpark

// Connector settings: where Elasticsearch lives and whether to auto-create the index
val conf = new SparkConf()
  .setAppName("EsSparkTest")
  .set("es.nodes", "localhost")
  .set("es.port", "9200")
  .set("es.index.auto.create", "true")

val ssc = new StreamingContext(conf, Seconds(1))

val numbers = Map("one" -> 1, "two" -> 2, "three" -> 3)
val airports = Map("arrival" -> "Otopeni", "SFO" -> "San Fran")

// One-liner alternative: ssc.sparkContext.makeRDD(Seq(numbers, airports)).saveToEs("spark/docs")
val esRDD = ssc.sparkContext.makeRDD(Seq(numbers, airports))
EsSpark.saveToEs(esRDD, "spark/docs")

Looking forward to any help.

It's a classpath error: the class resolves at compile time, but the elasticsearch-spark jar is missing from the classpath at runtime. Make sure the dependency has compile scope (not "provided") and that the jar actually ships with your job.
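
If you run via spark-submit, one way to get the connector onto both the driver and executor classpaths is --packages, which resolves the coordinate (and its transitive dependencies) from Maven Central. A sketch; the main class and application jar path are placeholders:

spark-submit \
  --class com.example.EsSparkTest \
  --packages org.elasticsearch:elasticsearch-spark_2.10:2.2.0-m1 \
  target/scala-2.10/your-app_2.10-0.1.jar

Alternatively, build a fat jar with sbt-assembly so the connector is bundled into the application jar. Either way, the point is that compiling against the jar is not enough; it also has to be present at runtime.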

I'm getting the same error. Were you able to solve it?