Is there any way to make indexing documents with a large number of nested
objects less memory-hungry?
I need to index documents containing an array with up to a million objects,
using this mapping:
"myobject": {
  "properties": {
    "name": {"type": "string"},
    "foo": {"type": "long"},
    "bar": {"type": "long"}
  }
}
I have disabled the _all field in the mapping ("_all": {"enabled": false}),
which helped a lot, but for the biggest documents I still get an
OutOfMemoryError.
I have tried setting the field type to both "object" and "nested".
Any tips?
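One workaround I'm considering, in case it helps frame the question (a sketch only, nothing I've benchmarked; field names like "parent_id" are made up): denormalize the array into one small document per element and send those in fixed-size batches to the bulk API, so no single request ever holds the whole million-element array in memory.

```python
def flatten(parent_id, objects):
    """Turn one huge array field into standalone child documents.

    Each element of the array becomes its own small document, linked back
    to the original document via an illustrative "parent_id" field.
    """
    for pos, obj in enumerate(objects):
        yield {
            "parent_id": parent_id,  # made-up field to rejoin children to the parent
            "position": pos,         # preserve the element's index in the array
            "name": obj["name"],
            "foo": obj["foo"],
            "bar": obj["bar"],
        }

def iter_chunks(docs, chunk_size):
    """Group documents into fixed-size batches for bulk requests."""
    batch = []
    for doc in docs:
        batch.append(doc)
        if len(batch) == chunk_size:
            yield batch
            batch = []
    if batch:
        yield batch
```

The idea is that each bulk request then carries at most chunk_size small documents instead of one giant one, at the cost of having to query the children by parent_id instead of as a single nested document.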
Thanks
--
You received this message because you are subscribed to the Google Groups "elasticsearch" group.
To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/aa99b9b8-e46a-4d94-a910-ea585c0935ec%40googlegroups.com.