ElasticSearch Spark

I have a DataFrame and am writing it to Elasticsearch.

Before writing to ES, I am converting the EVTExit column, which is in epoch milliseconds, to Date:

**workset = workset.withColumn("EVTExit", to_date(from_unixtime($"EVTExit".divide(1000))))**

workset.select("EVTExit").show(10)
+----------+
|   EVTExit|
+----------+
|2014-06-03|
|      null|
|2012-10-23|
|2014-06-03|
|2015-11-05|
+----------+

As I can see, EVTExit is converted to Date.
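
For reference, a self-contained sketch of that conversion (assuming workset is an existing DataFrame and EVTExit holds epoch milliseconds):

import org.apache.spark.sql.functions.{col, from_unixtime, to_date}

// EVTExit is in epoch milliseconds, so divide by 1000 to get seconds,
// convert to a timestamp with from_unixtime, then truncate to DateType.
val converted = workset.withColumn("EVTExit", to_date(from_unixtime(col("EVTExit").divide(1000))))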

**workset.write.format("org.elasticsearch.spark.sql").save("workset/workset1")**
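
For completeness, the same write with the connection settings spelled out (the es.nodes and es.port values here are assumptions, not my actual cluster):

// es.nodes / es.port are placeholders for the target cluster;
// "workset/workset1" is the index/type the rows are saved to.
workset.write.format("org.elasticsearch.spark.sql")
  .option("es.nodes", "localhost")
  .option("es.port", "9200")
  .save("workset/workset1")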

But after writing it to ES, I am getting back the same epoch format:

"EVTExit" : 1401778800000

Does anyone have an idea what's going wrong here?

Thanks

Did you set the ES mapping correctly?

Also, try setting es.mapping.date.rich to false. (https://www.elastic.co/guide/en/elasticsearch/hadoop/current/configuration.html#cfg-field-info)

val df2 = df.withColumn("EVTExit_1", $"EVTExit".cast("string"))  // keep the date as a plain string
df2.write.format("org.elasticsearch.spark.sql")
  .option("es.mapping.date.rich", "false")
  .save("workset/workset1")
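
Another option, if you want a readable date in the index, is a sketch along these lines: format the column as an ISO yyyy-MM-dd string before writing, which Elasticsearch's default dynamic date detection will normally map as a date (assuming the field is not already mapped otherwise):

import org.apache.spark.sql.functions.{col, date_format}

// Render the DateType column as an ISO "yyyy-MM-dd" string; with default
// dynamic date detection, ES maps such strings as dates on a fresh field.
val df3 = df.withColumn("EVTExit_1", date_format(col("EVTExit"), "yyyy-MM-dd"))
df3.write.format("org.elasticsearch.spark.sql").save("workset/workset1")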

Yes, that did the work. Thank you.