Spark SQL pushdown conjunction filters

Sorry for the late reply. My versions:
"elasticsearch-spark" % "2.2.0-beta1"
"spark-core" % "1.5.2"
elasticsearch-2.0.0
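
In sbt form that is roughly the following (the spark-sql entry is my assumption, since the DataFrame API needs it and only spark-core was listed above):

    libraryDependencies ++= Seq(
      "org.apache.spark"  %% "spark-core"          % "1.5.2",
      "org.apache.spark"  %% "spark-sql"           % "1.5.2",        // assumed: required for DataFrames
      "org.elasticsearch" %% "elasticsearch-spark" % "2.2.0-beta1"
    )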

My DataFrame has four fields (title, recordStart, recordEnd, recordDuration), and their mapping is:

"recordEnd": {
       "type": "date",
       "format": "dateOptionalTime"
},
"recordStart": {
	"type": "date",
	"format": "dateOptionalTime"
},
"recordDuration": {
	"type": "long"
},
"title": {
	"type": "string",
	"fields": {
		"raw": {"type":"string", "index":"not_analyzed"}
	}
},
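
For context, a minimal sketch of how a DataFrame like this can be loaded from ES and filtered with a conjunction, so that es-hadoop's pushdown can translate the predicates into an Elasticsearch query. The index/type name "records/record", the ES host and the literal values are placeholders, not what I actually use:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object PushdownSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("es-pushdown-sketch")
          .set("es.nodes", "localhost")        // placeholder ES host
        val sc = new SparkContext(conf)
        val sqlContext = new SQLContext(sc)

        // Load the index/type as a DataFrame through the es-hadoop data source.
        val df = sqlContext.read
          .format("org.elasticsearch.spark.sql")
          .option("pushdown", "true")          // pushdown is on by default in 2.1+, shown explicitly here
          .load("records/record")              // placeholder index/type

        // Conjunction of two predicates on the mapped fields; this is the kind
        // of AND filter the thread is about pushing down to Elasticsearch.
        val filtered = df.filter(
          df("recordStart") >= "2015-12-01T00:00:00" && df("recordDuration") > 60L
        )
        filtered.show()
      }
    }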