Parent-child queries running slow

Here is my current Elasticsearch dataset layout. We are using parent-child mappings for our data.
We have around 100 million parents and 3 billion children for these parents.
My index mapping:
{
  "geneticsdb_v4": {
    "mappings": {
      "variant": {
        "properties": {}
      },
      "allelefreq": {
        "_parent": {
          "type": "variant"
        },
        "properties": {}
      }
    }
  }
}
I loaded all my documents into the above index with the mapping above.
Now I am trying to retrieve data using has_child queries, but they are very slow.
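
For illustration, the has_child queries have roughly this shape (the match_all here is just a placeholder for the real child filters):

GET geneticsdb_v4/_search
{
  "query": {
    "has_child": {
      "type": "allelefreq",
      "query": {
        "match_all": {}
      }
    }
  }
}
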
I am trying to improve my index performance using fielddata:

"fielddata": {
  "loading": "eager_global_ordinals"
}
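
The update itself was applied with a mapping request of roughly this form (a sketch only; the index and type names are taken from the mapping above):

PUT geneticsdb_v4/_mapping/allelefreq
{
  "_parent": {
    "type": "variant",
    "fielddata": {
      "loading": "eager_global_ordinals"
    }
  }
}
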
I updated my index mapping to enable eager global ordinals on _parent. After I ran this, here is my current mapping:

{
  "My_index": {
    "mappings": {
      "variant": {
        "properties": {}
      },
      "allelefreq": {
        "_parent": {
          "type": "variant",
          "fielddata": {
            "loading": "eager_global_ordinals"
          }
        },
        "properties": {}
      }
    }
  }
}

My queries are still running slow even after updating the mapping.
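
As a sanity check, the fielddata stats can show whether the _parent field data (and its global ordinals) is actually loaded into memory, for example:

GET My_index/_stats/fielddata?fields=*
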

Do I need to reindex the whole dataset?

That is a lot.

How many nodes do you have? What version are you on? What is their config?

We have around 10 nodes, and the index was created with 24 shards. Each shard is around 240 GB.
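
For reference, per-shard sizes like these can be read from the cat shards API (columns here are picked just for brevity):

GET _cat/shards?v&h=index,shard,prirep,store,node
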

That's very large. I'd suggest you may need more nodes.

What version are you on?

Shard size will affect query latency, and those shards are considerably larger than what we usually recommend. Queries are executed in parallel across all shards, but each shard is processed in a single thread. Having few, large shards can therefore be slow and limit the amount of concurrent processing.

Create an index with a single shard and gradually increase the data volume while running queries and monitoring the latencies. This should give you an idea of the maximum shard size you should aim for.
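
As a sketch of that benchmarking setup (the index name and settings here are illustrative, reusing the same parent-child mapping as the production index):

PUT benchmark_v1
{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0
  },
  "mappings": {
    "variant": {
      "properties": {}
    },
    "allelefreq": {
      "_parent": {
        "type": "variant",
        "fielddata": {
          "loading": "eager_global_ordinals"
        }
      },
      "properties": {}
    }
  }
}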