Exception when using Elasticsearch-spark and Elasticsearch-core together

Thank you for your hints.

I solved this issue by adding

exclude("org.apache.spark", "spark-network-common_2.10")

instead of

exclude("com.google.guava", "guava")

It looks like spark-network-common_2.10 bundles Guava classes of its own, and those duplicates are what was causing the conflict.
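In case anyone wants to verify that for themselves, here is a minimal sketch that lists the Guava class files bundled inside a jar. The object name FindBundledGuava is just an example; pass the path to spark-network-common_2.10-<version>.jar from your local Ivy cache as the argument:

import java.util.jar.JarFile
import scala.collection.JavaConverters._

object FindBundledGuava {
  def main(args: Array[String]): Unit = {
    // args(0) should point at spark-network-common_2.10-<version>.jar
    val jar = new JarFile(args(0))
    try {
      jar.entries().asScala
        .map(_.getName)
        .filter(n => n.startsWith("com/google/common") && n.endsWith(".class"))
        .foreach(println) // each hit is a Guava class shipped inside the jar
    } finally jar.close()
  }
}

If that prints anything, the jar is shipping its own Guava classes alongside the ones pulled in elsewhere, which is exactly the kind of class clash that produces the exception.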
So the final build.sbt includes the following:

// Exclude jars that conflict with Spark (see https://github.com/sbt/sbt-assembly)
libraryDependencies ~= { _ map {
  case m if m.organization == "org.elasticsearch" =>
    m.exclude("commons-logging", "commons-logging").
      exclude("commons-collections", "commons-collections").
      exclude("commons-beanutils", "commons-beanutils-core").
      exclude("com.esotericsoftware.minlog", "minlog").
      exclude("joda-time", "joda-time").
      exclude("org.apache.commons", "commons-lang3").
      // spark-network-common bundles Guava classes, so exclude it as well
      exclude("org.apache.spark", "spark-network-common_2.10")
  case m => m
}}
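With that exclusion in place, the assembly builds cleanly and the exception is gone. If other duplicate-class conflicts show up later, sbt-assembly's merge strategies (see the link in the comment above) are the usual next step.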