Elasticsearch-Spark connector: How to read data from an ES index whose nested JSON has array fields

I have a use case where the same field names appear inside a JSON array field, as shown below.
While reading the ES index I included "es.read.field.as.array.include": "distrChan.Notifications.OfferNtfy.AId, distrChan.Notifications.OfferNtfy.EId".
But I'm running into an issue because the index contains two fields whose names differ only in case (EId vs. eid). I would like to read only EId. Could you please tell me how to read only EId and not eid? If I include only EId in es.read.field.as.array.include, it fails. Is there any way to pass a schema explicitly instead of letting the connector infer it from the ES index?
"distrChan": [
"enrollCap": "",
"featuredOffer": "",
"geoTargetInd": "",
"geoTargetMthd": "",
"geoTargetParam": "",
"channelDispStartDt": "
"channelDispEndDt": "*****",
"DistrChanId": "7",
"GeoTargetVal": [
"Notifications": {
"OfferNtfy": [
"EId": "37",
"AId": "123",
"EId": "123456",
"AId": "1234545",

I'm not exactly sure what you're trying to do or what's failing. Can you post everything needed to reproduce the problem (the index mappings, some sample documents, and the Spark commands), along with the error(s) you're seeing? Also, have you tried setting

spark.sql("set spark.sql.caseSensitive=true")

I believe Spark SQL is case-insensitive by default.
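In case it helps, here is a rough Scala sketch of two approaches: a case-sensitive read where the connector infers the schema, and a workaround that bypasses inference by pulling raw JSON and applying a schema you define. The index name `myindex`, the array-field list, and the trimmed-down schema are assumptions on my part, not tested against your mapping. Note that, as I understand it, `es.read.field.as.array.include` expects the names of the fields that *are* arrays (e.g. `distrChan.Notifications.OfferNtfy`), not their leaf attributes, which may be part of what's going wrong.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.types._
    import org.elasticsearch.spark._   // adds esJsonRDD to SparkContext

    val spark = SparkSession.builder()
      .appName("es-nested-read")
      // distinguish EId from eid when resolving columns
      .config("spark.sql.caseSensitive", "true")
      .getOrCreate()
    import spark.implicits._

    // Approach 1: let the connector infer the schema, but tell it which
    // nested fields are arrays ("myindex" is a placeholder).
    val inferred = spark.read
      .format("org.elasticsearch.spark.sql")
      .option("es.read.field.as.array.include",
              "distrChan,distrChan.GeoTargetVal,distrChan.Notifications.OfferNtfy")
      .load("myindex")

    // Approach 2: bypass inference -- read each document as a raw JSON
    // string and apply an explicit schema that only mentions the fields
    // you want (here, only EId/AId inside OfferNtfy).
    val schema = StructType(Seq(
      StructField("distrChan", ArrayType(StructType(Seq(
        StructField("DistrChanId", StringType),
        StructField("Notifications", StructType(Seq(
          StructField("OfferNtfy", ArrayType(StructType(Seq(
            StructField("EId", StringType),
            StructField("AId", StringType)))))))))))))

    // esJsonRDD yields (documentId, jsonString) pairs; keep the JSON.
    val rawJson  = spark.sparkContext.esJsonRDD("myindex").values.toDS()
    val explicit = spark.read.schema(schema).json(rawJson)

With approach 2, anything not named in the schema (including the lowercase `eid`) is simply dropped, so the case clash never comes up.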