Adjusting the limit on App Search index ingestion

I have a corpus of 1.5 million records, and from my understanding there is a limit to the number of documents that can be imported with each request. I am running a deployment in the cloud, so clearly I am governed by the rate the VMs can process.

Is there a way to increase the number of docs that a single request can process? Otherwise I think it becomes a matter of dividing the records into batches and firing as many concurrent requests as my servers can handle, as in the sketch below.
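For what it's worth, here is a minimal sketch of the batching approach I have in mind, using Python and the `requests` library against the App Search documents endpoint. The base URL, engine name, API key, and worker count are placeholders for my deployment, so treat this as a rough outline rather than production code.

```python
import concurrent.futures
import json

import requests

# Placeholder values for my deployment -- substitute your own.
BASE_URL = "https://your-deployment.ent.example.com"
ENGINE = "my-engine"
API_KEY = "private-xxxxxxxxxxxx"

DOCS_ENDPOINT = f"{BASE_URL}/api/as/v1/engines/{ENGINE}/documents"
HEADERS = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}
BATCH_SIZE = 100  # App Search caps each indexing request at 100 documents


def chunk(records, size=BATCH_SIZE):
    """Yield successive fixed-size slices of the record list."""
    for i in range(0, len(records), size):
        yield records[i:i + size]


def index_batch(batch):
    """POST one batch of up to 100 documents and return the API response."""
    resp = requests.post(DOCS_ENDPOINT, headers=HEADERS, data=json.dumps(batch))
    resp.raise_for_status()
    return resp.json()


def index_all(records, workers=8):
    """Index every record, running several batches concurrently."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(index_batch, b) for b in chunk(records)]
        for f in concurrent.futures.as_completed(futures):
            f.result()  # surface any request error
```

Even so, that is a fair amount of plumbing just to get 1.5 million records in.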

Also, does the "standard" Elasticsearch API offer a different degree of capability when ingesting records? I find the 100-documents-per-request limit a bit constraining, and it seems App Search forces more development effort around concurrent batching than I had hoped.

Thanks in advance.

Hi @David_Robbins,

Unfortunately, the 100-document limit is hard-coded and not configurable. I can tell you with confidence that this type of thing is top of mind for our team when looking at the future of App Search/Enterprise Search, but I can't give an exact timeline on when to expect this to change.

Best,
Brian


Thanks Brian. I've spent quite a bit of time experimenting with the GCP Marketplace integration, and while it is not supported functionality, I was able to use the GCP BigQuery Dataflow template to update the App Search index. If I had a choice between bumping up the limit in the API or using the GCP BigQuery integration, I would choose the GCP route.

Thanks again - eager to see the next release features.
