Hi,
I am facing an issue with ES-Hadoop and Spark; the exception trace is below. I can't find any ES-Hadoop or Spark connector built for Scala 2.11. Any pointers?
I am using elasticsearch-hadoop 2.3.3 with Scala 2.11.7 everywhere:
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "2.3.4"
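For context, here is a sketch of the build.sbt I have been experimenting with. The `elasticsearch-spark` artifact name and the Spark version are assumptions on my part (I'm not sure a `_2.11` build of it is published for this version line); the rest matches my setup:

```scala
// build.sbt sketch -- scalaVersion matches my environment;
// the spark-core version and the elasticsearch-spark artifact
// (resolved as elasticsearch-spark_2.11 via %%) are assumptions.
scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
  // %% appends the Scala binary version (_2.11) to the artifact name,
  // so this only resolves if a 2.11 build was actually published
  "org.elasticsearch" %% "elasticsearch-spark" % "2.3.3"
)
```

My understanding is that `%` (as in the `elasticsearch` dependency above) pulls the artifact name verbatim with no Scala suffix, so a plain-`%` dependency compiled against Scala 2.10 would load fine at compile time but fail at runtime exactly like the trace below.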
Output of the following command:
sbt "inspect tree clean" | grep 'org.scala-lang'
[info] | +-:ivyScala = Some(IvyScala(2.11.7,2.11,List(),true,false,false,org.scala-lang))
[info] | | +-/:scalaOrganization = org.scala-lang
[info] | +-:libraryDependencies = List(org.scala-lang:scala-library:2.11.7, org.scoverage:scalac-scoverage-runtime_2.11:1.1.1:provided, org.scoverage:scalac-scoverage-plugin_2.11:1.1.1:provided, org.elasticsearch:elasticsearch-..
[info] | | +-/:scalaOrganization = org.scala-lang
[info] | +-/:scalaOrganization = org.scala-lang
[info] +-:ivyScala = Some(IvyScala(2.11.7,2.11,List(),true,false,false,org.scala-lang))
[info] | +-/*:scalaOrganization = org.scala-lang
--------------- Stack trace ---------------
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at org.elasticsearch.spark.rdd.AbstractEsRDDIterator.<init>(AbstractEsRDDIterator.scala:10)
at org.elasticsearch.spark.rdd.ScalaEsRDDIterator.<init>(ScalaEsRDD.scala:25)
at org.elasticsearch.spark.rdd.ScalaEsRDD.compute(ScalaEsRDD.scala:21)
at org.elasticsearch.spark.rdd.ScalaEsRDD.compute(ScalaEsRDD.scala:15)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)