Indexing very large and searchable documents

Hi there!

Has anyone already tried to index very large documents (>300 MB) in Elasticsearch?
When we try, we get an OutOfMemoryError, along with GC messages like these:

```
[INFO ][o.e.m.j.JvmGcMonitorService] [KR7Jg8N] [gc][1395] overhead, spent [331ms] collecting in the last [1s]
[WARN ][o.e.m.j.JvmGcMonitorService] [KR7Jg8N] [gc][old][1415][22] duration [41.8s], collections [1]/[42.1s], total [41.8s]/[8.5m], memory [15.9gb]->[15.9gb]/[15.9gb], all_pools {[young] [124mb]->[127.6mb]/[133.1mb]}{[survivor] [0b]->[0b]/[16.6mb]}{[old] [15.8gb]->[15.8gb]/[15.8gb]}
```

In dev mode we're running a single node (ES 6.5) in Docker with a 16 GB heap (Xmx).
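
For reference, the dev node is started roughly like this (a minimal sketch; the image tag and service name are assumptions on my part, only the single-node setup and 16 GB heap match what we actually run):

```yaml
# docker-compose.yml (sketch): single ES 6.5 dev node with a 16 GB heap
version: '2.2'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.5.4
    environment:
      - discovery.type=single-node          # one-node dev cluster
      - "ES_JAVA_OPTS=-Xms16g -Xmx16g"      # 16 GB heap, as described above
    ulimits:
      memlock:
        soft: -1
        hard: -1
    ports:
      - "9200:9200"
```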

We saw the http.max_content_length setting and have already increased it to 600mb:
https://www.elastic.co/guide/en/elasticsearch/reference/6.5/modules-http.html
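
Concretely, that change is just this one line in elasticsearch.yml (raising it from the default 100mb so the large request body is accepted at all):

```yaml
# elasticsearch.yml (sketch): allow HTTP request bodies up to 600mb
http.max_content_length: 600mb
```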

Does anyone have feedback or technical recommendations for dealing with this?

Thanks.
