Does Elastic Cloud have any hard limits on `http.max_content_length`?

Neither OpenSearch nor Elasticsearch is really designed to cope with individual documents larger than 100MiB, so raising `http.max_content_length` would just lead to other problems, and hence it isn't something you can change on Elastic Cloud. Typically this only matters if you're storing enormous amounts of unsearchable binary data (image-heavy documents or videos), and in that case there's no real need to store it directly in the search engine. Instead, put it in a separate blob store and index only a link to the blob in your search engine.
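To illustrate the pattern, here's a minimal sketch. The blob store and index are plain dicts standing in for a real object store (e.g. S3) and an Elasticsearch/OpenSearch index, and all names (`store_document`, `blobs.example.com`, etc.) are hypothetical:

```python
import hashlib

blob_store = {}    # stands in for S3/GCS/Azure Blob Storage
search_index = {}  # stands in for an Elasticsearch/OpenSearch index

def store_document(doc_id, title, binary_payload):
    """Put the large binary in the blob store; index only metadata plus a link."""
    blob_key = hashlib.sha256(binary_payload).hexdigest()
    blob_store[blob_key] = binary_payload
    # The indexed document stays tiny regardless of the binary's size,
    # so it never approaches the http.max_content_length limit.
    search_index[doc_id] = {
        "title": title,
        "blob_url": f"https://blobs.example.com/{blob_key}",
        "size_bytes": len(binary_payload),
    }
    return search_index[doc_id]

# A 150MiB payload: far over the 100MiB request limit, but the
# indexed document holds only a link and a few metadata fields.
doc = store_document("report-1", "Quarterly report", b"\x00" * (150 * 1024 * 1024))
```

Searches then run against the small metadata documents, and the application fetches the binary from the blob store only when a user actually opens a result.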

Regarding whether Logstash can split large binary objects across multiple documents, you'll need to ask the Logstash folks about that. I suggest opening a separate topic in the Logstash forum.