Empty strings not being allowed for insertion into ES from Spark

An exception is thrown when I run my Scala app, which writes data with myRDD.saveToEs. My ES version is 2.3.5.
I am using Spark 1.5.0, so perhaps there is a setting I can put on the SparkContext that I am not aware of. The relevant part of the stack trace is below:
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 1 times, most recent failure: Lost task 0.0 in stage 2.0 (TID 2, localhost): org.apache.spark.util.TaskCompletionListenerException: Found unrecoverable error [127.0.0.1:9200] returned Bad Request(400) - failed to parse [foo_eff_dt];Invalid format: ""; Bailing out..
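
In case it helps, this is a minimal sketch of what my write path looks like. The index name and most of the field values are placeholders, not my real schema; only foo_eff_dt matches the field named in the error, and some source rows genuinely have no value for it, so it arrives as an empty string:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._

val conf = new SparkConf()
  .setAppName("es-writer")
  .set("es.nodes", "127.0.0.1")
  .set("es.port", "9200")

val sc = new SparkContext(conf)

// Placeholder data: the second row has an empty date, like my real input does
val myRDD = sc.makeRDD(Seq(
  Map("id" -> "1", "foo_eff_dt" -> "2016-01-01"),
  Map("id" -> "2", "foo_eff_dt" -> "")   // this kind of row triggers the 400
))

myRDD.saveToEs("myindex/mytype")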

Please help/guide.
TIA.