Field data is too large

I have a problem with sorting: sorting works, but only on the price field. When I try to sort by start_date, end_date, uid, cat or title, I get a message about exceeding the limit:

Data too large, data for ["name of field here"] would be larger than limit of [19798897459/18.4gb]

I do not know why this is happening; the code looks correct. A sample query built with Elastica looks like this:
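(Roughly and simplified; this is a sketch rather than the exact request Elastica generates. The allek index and auctions type are the ones shown in the mapping and stats below.)

    POST /allek/auctions/_search
    {
        "query": { "match_all": {} },
        "sort": [
            { "start_date": { "order": "desc" } }
        ]
    }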

Mapping:

"auctions": {
                "_all": { "enabled": false }, 
                "properties": {
                    "cat": { "store": true,  "type": "long" }, 
                    "curr": { "index": "not_analyzed",  "store": true,  "type": "string" }, 
                    "end_date": { "store": true,  "type": "long" }, 
                    "price": { "store": true,  "type": "long" }, 
                    "start_date": { "store": true,  "type": "long" }, 
                    "tcat": { "store": true,  "type": "long" }, 
                    "title": { "store": true,  "type": "string" }, 
                    "uid": { "store": true,  "type": "long" }
                }
            }, 

/_stats/fielddata?fields=*

{
    "_shards": {
        "total": 10,
        "successful": 5,
        "failed": 0
    },
    "_all": {
        "primaries": {
            "fielddata": {
                "memory_size_in_bytes": 19466671904,
                "evictions": 0,
                "fields": {
                    "_id": {
                        "memory_size_in_bytes": 0
                    },
                    "cat": {
                        "memory_size_in_bytes": 0
                    },
                    "price": {
                        "memory_size_in_bytes": 3235221240
                    },
                    "title": {
                        "memory_size_in_bytes": 16231450664
                    }
                }
            }
        },
        "total": {
            "fielddata": {
                "memory_size_in_bytes": 19466671904,
                "evictions": 0,
                "fields": {
                    "_id": {
                        "memory_size_in_bytes": 0
                    },
                    "cat": {
                        "memory_size_in_bytes": 0
                    },
                    "price": {
                        "memory_size_in_bytes": 3235221240
                    },
                    "title": {
                        "memory_size_in_bytes": 16231450664
                    }
                }
            }
        }
    },
    "indices": {
        "allek": {
            "primaries": {
                "fielddata": {
                    "memory_size_in_bytes": 19466671904,
                    "evictions": 0,
                    "fields": {
                        "_id": {
                            "memory_size_in_bytes": 0
                        },
                        "cat": {
                            "memory_size_in_bytes": 0
                        },
                        "price": {
                            "memory_size_in_bytes": 3235221240
                        },
                        "title": {
                            "memory_size_in_bytes": 16231450664
                        }
                    }
                }
            },
            "total": {
                "fielddata": {
                    "memory_size_in_bytes": 19466671904,
                    "evictions": 0,
                    "fields": {
                        "_id": {
                            "memory_size_in_bytes": 0
                        },
                        "cat": {
                            "memory_size_in_bytes": 0
                        },
                        "price": {
                            "memory_size_in_bytes": 3235221240
                        },
                        "title": {
                            "memory_size_in_bytes": 16231450664
                        }
                    }
                }
            }
        }
    }
}

Any ideas what might cause this exception?

I should add that I've tried these solutions:

FIELDDATA Data is too large

But the effect was even worse: afterwards no query ran quickly at all.

I would be extremely grateful for any help!

As per the fielddata stats, the title field is using ~15GB of heap.
Can you please let me know your heap settings?
Please enable doc_values for the title field, which will reduce the fielddata memory pressure.
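
To read the heap settings you can use the nodes stats API (same style as the stats call above); jvm.mem.heap_max_in_bytes is the configured maximum and jvm.mem.heap_used_in_bytes is the current usage. The maximum is normally set through the ES_HEAP_SIZE environment variable when the node is started.

    /_nodes/stats/jvm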

You are also hitting the fielddata circuit breaker, which defaults to 60% of the heap.
If you set doc_values to true for the title field, the fielddata will be stored on disk instead of on the heap.
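
As a rough sketch (untested here, and a reindex would be needed, since doc_values cannot be switched on for fields that already exist): the numeric fields can simply get "doc_values": true, while on this version doc_values only work for not_analyzed strings, so one option for title is a not_analyzed raw sub-field to sort on:

    "auctions": {
        "_all": { "enabled": false },
        "properties": {
            "cat":        { "store": true, "type": "long",   "doc_values": true },
            "curr":       { "store": true, "type": "string", "index": "not_analyzed", "doc_values": true },
            "end_date":   { "store": true, "type": "long",   "doc_values": true },
            "price":      { "store": true, "type": "long",   "doc_values": true },
            "start_date": { "store": true, "type": "long",   "doc_values": true },
            "tcat":       { "store": true, "type": "long",   "doc_values": true },
            "title": {
                "store": true, "type": "string",
                "fields": {
                    "raw": { "type": "string", "index": "not_analyzed", "doc_values": true }
                }
            },
            "uid":        { "store": true, "type": "long",   "doc_values": true }
        }
    }

Sorting would then target title.raw instead of title, so the heavy fielddata for the analyzed title field is no longer built.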
Please let me know if you need any further help.

I did not configure the Elasticsearch server myself, but could you tell me:

  1. How can I get the heap settings?
  2. How do I enable doc_values for the title field? (This error occurs not only for title, but for all fields.)