Hello,

My context: Spark / spark-shell 1.0.1, JDK 1.7, Scala 2.10.4, ES-Hadoop 2.1.0 (nightly build).

My problem: I am unable to send RDDs from Spark to ES. I get a NoClassDefFoundError (org/codehaus/jackson/annotate/JsonClass); see below. Which Jackson JARs do I need to add to the Spark shell?

Best regards,
Philippe
$ bin/spark-shell --jars /usr/lib/spark-1.0/lib/elasticsearch-hadoop-2.1.0.jar
......
spark version 1.0.1
Using Scala version 2.10.4
..............
14/08/06 17:19:36 INFO SparkContext: Added JAR file:/usr/lib/spark-1.0/lib/elasticsearch-hadoop-2.1.0.jar
scala>
import org.elasticsearch.hadoop.mr.EsOutputFormat
import org.elasticsearch.hadoop.mr.EsInputFormat
import org.elasticsearch.hadoop.cfg.ConfigurationOptions
import org.apache.hadoop.mapred.{FileOutputCommitter, FileOutputFormat, JobConf, OutputFormat}
import org.apache.hadoop.fs.Path
import org.apache.hadoop.io.{MapWritable, Text, NullWritable}

// Configure the ES source: the index/type to read from and a match-all query.
val jobConf = new JobConf(sc.hadoopConfiguration)
jobConf.set("es.resource", "myindex/mytype")
jobConf.set("es.query", "?q=*")

// Read the index into an RDD of (Text, MapWritable) pairs via the old mapred API.
val esRDD = sc.hadoopRDD(jobConf, classOf[EsInputFormat[Text, MapWritable]], classOf[Text], classOf[MapWritable])
// up to there everything is OK
esRDD.count()
---->
java.lang.NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass
        at org.codehaus.jackson.map.introspect.JacksonAnnotationIntrospector.findDeserializationType(JacksonAnnotationIntrospector.java:524)
        .........
        at org.elasticsearch.hadoop.rest.RestClient.parseContent(RestClient.java:119)
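For what it's worth, the imports above also pull in EsOutputFormat, FileOutputCommitter and FileOutputFormat, but only the read path is shown. A minimal sketch of the corresponding write path in the same session could look like the following; the sample document and the reuse of myindex/mytype are illustrative assumptions, not taken from the post. Note that Writables are not Java-serializable, so the MapWritable is built inside the map rather than parallelized directly.

// Write sketch (illustrative only; same spark-shell session and index as above).
import org.apache.spark.SparkContext._   // PairRDDFunctions (auto-imported in spark-shell)

val outConf = new JobConf(sc.hadoopConfiguration)
outConf.set("mapred.output.format.class", "org.elasticsearch.hadoop.mr.EsOutputFormat")
outConf.setOutputCommitter(classOf[FileOutputCommitter])
outConf.set("es.resource", "myindex/mytype")
FileOutputFormat.setOutputPath(outConf, new Path("-"))   // dummy path, not used by ES

// One hypothetical document, converted to (key, MapWritable) pairs on the executors.
val docs = sc.makeRDD(Seq(Map("field" -> "value")))
docs.map { m =>
  val doc = new MapWritable()
  m.foreach { case (k, v) => doc.put(new Text(k), new Text(v)) }
  (NullWritable.get(), doc)
}.saveAsHadoopDataset(outConf)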
Sorry for the mistake above: the problem is actually that I am unable to read from ES and create RDDs.
Hi Phil,

Glad to see the work in es-hadoop master is being picked up even without any public announcement of it.

The issue has been fixed in master [1] and already pushed to Maven - can you please update and try again?

FTR: the issue seems to be caused by multiple versions of Jackson being pulled onto the classpath (one from Hadoop, another from Spark), which on some platforms causes class-loading issues in Jackson during start-up. The fix in master hopefully remedies that.
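If it helps to verify, a quick check in the shell (just a sketch; the two class names mirror the stack trace above) is to print which JARs the Jackson core and mapper classes are resolved from:

// Diagnostic sketch: show where the Jackson core and mapper classes are loaded from,
// to spot mixed versions on the classpath.
Seq(
  classOf[org.codehaus.jackson.JsonFactory],                                  // jackson-core-asl
  classOf[org.codehaus.jackson.map.introspect.JacksonAnnotationIntrospector]  // jackson-mapper-asl
).foreach { c =>
  val src = Option(c.getProtectionDomain.getCodeSource).map(_.getLocation)
  println(c.getName + " -> " + src.getOrElse("bootstrap/parent classloader"))
}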
Hi Costin,

Thanks for your help! It works now after the update.

Best regards,
Phil