Spark & Elasticsearch clash (Guava)

Hey, so I have a piece of code that indexes records into Elasticsearch. This code runs with Spark and Hadoop.
I just upgraded Elasticsearch to 2.3.1.
When I run my code on my local machine it works great.
When I try to run it as a spark-submit job, I get:

java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;

After searching Google, I realized the problem is with Guava, so in my pom.xml I just put

    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>19.0</version>
    </dependency>

under the dependencyManagement section.
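
For reference, this is how I've been checking which Guava version my own pom resolves (assuming a standard Maven build; as far as I understand it, this only shows my project's dependency tree, not what spark-submit actually puts on the runtime classpath):

    mvn dependency:tree -Dincludes=com.google.guava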

But the error still happens, so I guess Spark (1.6) is also using an older version of Guava, but I can't find where it comes from or how to solve it...
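
The only workaround I've come across so far is relocating Guava inside my own jar with the maven-shade-plugin, something along these lines (just a sketch I haven't verified yet; the shaded.com.google.common package name is only a placeholder):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.4.3</version>
        <executions>
            <execution>
                <phase>package</phase>
                <goals>
                    <goal>shade</goal>
                </goals>
                <configuration>
                    <relocations>
                        <!-- rewrite Guava packages in my jar so they don't collide
                             with the Guava that Spark/Hadoop ship on the classpath -->
                        <relocation>
                            <pattern>com.google.common</pattern>
                            <shadedPattern>shaded.com.google.common</shadedPattern>
                        </relocation>
                    </relocations>
                </configuration>
            </execution>
        </executions>
    </plugin>

Not sure if shading is the right direction here, or if there's a cleaner way to make spark-submit pick up the newer Guava.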