Create Index with Spark adding Dates and Doubles

Using Scala, I build a complete index from a Spark RDD of Maps, then save it with a call of the following form:

esFields.saveToEs(esIndexURI, Map("es.mapping.id" -> "id"))

Here esFields is an RDD of Maps; each Map contains an element ("id" -> docID), which is mapped to the Elasticsearch _id. I have no problem adding other strings or arrays of strings to the _source of a doc, so all of this works fine.
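
For context, here is a minimal, self-contained sketch of my current setup (the index/type name, doc IDs, and field values are placeholders, and it assumes an existing SparkContext named sc, e.g. in spark-shell):

import org.elasticsearch.spark._

// Placeholder index/type resource
val esIndexURI = "myindex/docs"

// Each Map becomes one document; the "id" field supplies the Elasticsearch _id
val esFields = sc.makeRDD(Seq(
  Map(
    "id"       -> "doc1",
    "title"    -> "first doc",
    "category" -> Array("cat1", "cat2"))))

// "es.mapping.id" tells the connector which field to use as the document _id
esFields.saveToEs(esIndexURI, Map("es.mapping.id" -> "id"))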

Now I want to add other types to the index, such as a timestamp and a Double. The date will be used in a date-range filter query. How do I add a date to the index using a Scala Map?

Can I have a Map[String, Any] and expect saveToEs to determine the types during the save? The Maps in the RDD would then be something like:

val m = Map(
  "id"        -> "docID",
  "category"  -> Array("cat1", "cat2"),
  "timeStamp" -> someDate)

If this is correct, what type should someDate be? I read the dates in as ISO-format strings, so I can construct any Date type needed. I need Elasticsearch to index them as dates so that date ranges can be used in queries.
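
If ISO-8601 strings are the right approach, I would produce someDate along these lines (just a sketch; the exact format pattern is my assumption about what an Elasticsearch date mapping accepts):

import java.text.SimpleDateFormat
import java.util.{Date, TimeZone}

// Format a timestamp as an ISO-8601 string in UTC, e.g. "2015-06-01T12:34:56Z"
val isoFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'Z'")
isoFormat.setTimeZone(TimeZone.getTimeZone("UTC"))
val someDate: String = isoFormat.format(new Date())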
