Yet another error: java.lang.NoSuchFieldError: FAIL_ON_SYMBOL_HASH_OVERFLOW

Hi All,

I'm trying to run a Spark job that uses the Elasticsearch client directly (instead of going through the elasticsearch-hadoop jar). I'm running into the NoSuchFieldError that others have reported. I've followed the advice in those posts to make sure no other Jackson jars end up on the classpath, and yet the error persists.
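
For anyone debugging the same thing, a quick way to see which jackson-core actually wins at runtime is to ask the JVM directly. A minimal sketch, assuming a Java entry point (JacksonProbe is just a made-up name; if the failure happens on the executors, run the same two lines inside the job rather than on the driver):

    import com.fasterxml.jackson.core.JsonFactory;

    public class JacksonProbe {
        public static void main(String[] args) throws Exception {
            // Which jar did JsonFactory actually come from?
            System.out.println(JsonFactory.class.getProtectionDomain()
                    .getCodeSource().getLocation());

            // FAIL_ON_SYMBOL_HASH_OVERFLOW was added to JsonFactory.Feature
            // in Jackson 2.4, so this reflective lookup throws
            // NoSuchFieldException when an older jackson-core is loaded.
            System.out.println(JsonFactory.Feature.class
                    .getField("FAIL_ON_SYMBOL_HASH_OVERFLOW"));
        }
    }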

$ mvn dependency:tree | grep jackson (after some cleanup)
com.fasterxml.jackson.core:jackson-annotations:jar:2.6.2:compile
com.fasterxml.jackson.core:jackson-core:jar:2.6.2:compile
com.fasterxml.jackson.core:jackson-databind:jar:2.6.2:compile
com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:jar:2.6.2:compile

com.fasterxml.jackson.module:jackson-module-paranamer:jar:2.6.2:test
com.fasterxml.jackson.module:jackson-module-scala_2.10:jar:2.6.2:test

org.codehaus.jackson:jackson-core-asl:jar:1.8.8:provided
org.codehaus.jackson:jackson-jaxrs:jar:1.8.3:provided
org.codehaus.jackson:jackson-mapper-asl:jar:1.8.8:provided
org.codehaus.jackson:jackson-xc:jar:1.8.3:provided
org.json4s:json4s-jackson_2.10:jar:3.2.10:provided
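
A caveat worth knowing here (it turned out to matter, see the update below): by default dependency:tree hides versions that lost Maven's conflict mediation, so a second copy of Jackson can be present without appearing in the list above. Verbose mode prints the omitted versions as well:

    $ mvn dependency:tree -Dverbose | grep jackson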

I did notice that the older Codehaus jars are also there (in the provided scope, coming from Hadoop and Spark on CDH 5.8.0), but since they live under a different package (org.codehaus.jackson rather than com.fasterxml.jackson), they shouldn't conflict with the 2.x jars.

Please advise.

Thanks,
Pradeep

Update: it turned out that Hadoop MapReduce was pulling in Jackson 2.2.3, which predates the FAIL_ON_SYMBOL_HASH_OVERFLOW field (it was added to JsonFactory.Feature in Jackson 2.4). Since I hadn't declared that artifact in my pom, it wasn't showing up in the default dependency tree output above.
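
In case it helps anyone else hitting this: once an older copy on the cluster classpath is the culprit, the usual way out is to shade and relocate Jackson inside the application jar, so the 2.2.3 copy Hadoop provides can never win the race. A sketch of a maven-shade-plugin configuration (the shaded.* prefix is just an example; pick any package name you own):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <relocations>
              <!-- Rename our bundled Jackson 2.6.2 classes so they cannot
                   collide with the 2.2.3 copy on the Hadoop classpath -->
              <relocation>
                <pattern>com.fasterxml.jackson</pattern>
                <shadedPattern>shaded.com.fasterxml.jackson</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>

Alternatively, setting spark.driver.userClassPathFirst=true and spark.executor.userClassPathFirst=true tells Spark to prefer the application's jars over its own, though those flags are marked experimental and can break other dependencies.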