Can Elasticsearch handle a high volume of updates, and how does it scale with concurrent inserts/updates?

We are currently using MongoDB for both of our read/write query patterns. However, we are seeing a lot of performance issues (as expected) when customers run full-text searches or complex aggregation queries in MongoDB. We are planning to move the full-text search capability to Elasticsearch, so we will duplicate the data for that collection in both MongoDB and Elasticsearch, serving searches from ES while our CRUD operations continue to go to MongoDB. But we have another collection that we are thinking of moving to ES only, and no longer storing in MongoDB going forward. For that collection we run multiple insert/update use cases, and the collection size can be huge. I have read that updates are expensive and that Elasticsearch is not great with updates. Do you see risk around how Elasticsearch will handle lots of updates? Any suggestions/recommendations?

For the best performance with Elasticsearch it is important to use bulk operations. If you perform inserts and updates individually, one request per document, performance can suffer significantly.
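
As a minimal sketch of what that looks like with the official Python client (the cluster URL, index name, and document shape here are placeholders, not something from your setup), using the `helpers.bulk` helper to send many operations in one request:

```python
from elasticsearch import Elasticsearch, helpers

# Hypothetical local cluster and index name; adjust for your environment.
es = Elasticsearch("http://localhost:9200")

docs = [{"sku": f"item-{i}", "price": i * 10} for i in range(10_000)]

# One bulk request carrying many index operations instead of 10,000
# individual HTTP calls; this batching is where most of the throughput
# gain comes from.
actions = (
    {"_op_type": "index", "_index": "products", "_id": d["sku"], "_source": d}
    for d in docs
)
helpers.bulk(es, actions, chunk_size=1000)
```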

In my experience Elasticsearch can handle large volumes of updates as long as they are spread across many documents and you are not frequently or concurrently updating the same documents. Under the hood an update reindexes the whole document, and concurrent writers hitting the same document will run into version conflicts, so documents with very high update rates incur a lot of overhead, which often results in poor performance.
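
If the same document can occasionally be updated from several writers, one common mitigation is to let Elasticsearch retry on version conflicts instead of failing the request. A hedged sketch, assuming the 8.x Python client and the same hypothetical index as above:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Partial update of a single document. retry_on_conflict tells the node to
# re-fetch and retry the update a few times if another writer changed the
# document in the meantime, instead of returning a version-conflict error.
es.update(
    index="products",   # hypothetical index name
    id="item-42",
    doc={"price": 99},
    retry_on_conflict=3,
)
```

This helps with occasional contention; it does not make sustained hot-spot updates on a single document cheap.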

Also note that data ingested into Elasticsearch is not immediately available for searching; writes only become visible after the next refresh, which happens every second by default. You can force a refresh on each write/update, but this adds further overhead.
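
For reference, the Python client exposes this per request via the `refresh` parameter; a small sketch (again with a hypothetical index) showing the trade-off:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Default: returns once the write is durable; the document only becomes
# searchable after the next periodic refresh (1s by default).
es.index(index="products", id="item-1", document={"price": 10})

# refresh="wait_for": blocks until a refresh makes this write searchable.
# refresh=True instead forces an immediate refresh, which is convenient
# but expensive at scale because it produces many tiny segments.
es.index(index="products", id="item-1", document={"price": 12}, refresh="wait_for")
```

If near-real-time visibility is not a hard requirement, leaving the default refresh behavior in place (and batching writes, as above) is usually the better-performing option.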