I post several large text files, each about 20~30 MB of plain text, into ES, and I use the attachment mapper as the field type to store these files.
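For context, an attachment mapping of this sort looks roughly like the following (hypothetical index and field names; this assumes the mapper-attachments plugin is installed, which provides the "attachment" field type):

```shell
# Hypothetical index "docs" and field "file"; requires the
# mapper-attachments plugin for the "attachment" type.
curl -XPUT 'http://localhost:9200/docs' -d '{
  "mappings": {
    "doc": {
      "properties": {
        "file": { "type": "attachment" }
      }
    }
  }
}'
```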
It costs a lot of memory: even when I post a single file, the used memory grows from about 150 MB to 250 MB. BTW, I use the default tokenizer for this field. I understand that such a file can generate many tokens, but what I don't understand is the memory cost. Does ES keep all the tokens in memory?
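One factor I can quantify: the attachment type takes the file content base64-encoded in the JSON body, so a 20 MB file already becomes a ~27 MB request before ES decodes and tokenizes anything, and both copies plus the token stream live in memory while the document is processed. A quick sketch of just the encoding overhead (pure Python, no ES involved; the 20 MB size is the hypothetical figure from above):

```python
import base64

# Simulate a 20 MB plain-text file (hypothetical size, matching the post).
raw = b"x" * (20 * 1024 * 1024)

# The attachment type expects base64-encoded content in the JSON body,
# which grows the payload by roughly a third before any tokenization.
encoded = base64.b64encode(raw)

print(len(raw) // (1024 * 1024), "MB raw")          # 20 MB raw
print(len(encoded) // (1024 * 1024), "MB encoded")  # 26 MB encoded
```

So part of the jump from 150 MB to 250 MB may simply be transient copies of the payload, independent of how the tokens themselves are stored.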
You received this message because you are subscribed to the Google Groups "elasticsearch" group.
To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/2f200f67-7024-4cdd-9c68-05875f0155ca%40googlegroups.com.