Spark 2.0.1: one-element array issue


I'm using elasticsearch-spark-20_2.10-5.0.1 with spark 2.0.1

I have a document containing two array fields (mo_service_type, mo_tag), each of which may hold only a single element:

"_source": {
"mo_service_type": [
"mo_tag": "glad",

var esConfig: Map[String, String] = Map(
  "es.nodes" -> ""
  , "es.port" -> "9200"
  // setting name was lost in the original post; es.read.field.as.array.include
  // is the es-hadoop 5.x option that declares fields as arrays
  , "es.read.field.as.array.include" -> "mo_service_type,mo_tag"
)
val heartbeatDF = EsSparkSQL.esDF(sqlContext, "heartbeat-*/heartbeat", esConfig)
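The queries below run against a table named hb, so presumably the DataFrame was registered along these lines (createOrReplaceTempView is the Spark 2.0 API; this step is not shown in the original post):

heartbeatDF.createOrReplaceTempView("hb")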

This query works:
select mo_service_type from hb limit 1

but this one fails:
select mo_tag from hb limit 1

How can I read this data?

com.databricks.backend.common.rpc.DatabricksExceptions$SQLExecutionException: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 231.0 failed 4 times, most recent failure: Lost task 0.3 in stage 231.0 (TID 6051, scala.MatchError: glad (of class java.lang.String)
at org.apache.spark.sql.catalyst.CatalystTypeConverters$ArrayConverter.toCatalystImpl(CatalystTypeConverters.scala:160)
at org.apache.spark.sql.catalyst.CatalystTypeConverters$ArrayConverter.toCatalystImpl(CatalystTypeConverters.scala:154)
at ...
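The MatchError hints at what is going on: Catalyst's ArrayConverter pattern-matches the incoming value against collection types, so when the schema declares an ArrayType but Elasticsearch hands back a bare string like "glad" for a one-element field, no case matches and Scala throws. A hypothetical sketch of the failure mode (not the actual connector code):

```scala
// Hypothetical illustration of Catalyst's array conversion, assuming the
// schema says ArrayType for the field.
def toCatalystArray(value: Any): Seq[Any] = value match {
  case s: Seq[_]   => s        // a real JSON array converts fine
  case a: Array[_] => a.toSeq
  // Without a fallback case like this one, a scalar String reproduces
  // "scala.MatchError: glad (of class java.lang.String)".
  case other       => Seq(other) // wrap the scalar as a one-element array
}

println(toCatalystArray("glad")) // List(glad)
```

This is why the connector needs to be told explicitly (via es.read.field.as.array.include) which fields are arrays: a single-element array and a scalar are indistinguishable in the JSON that Elasticsearch returns.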
