Hi all,
I am indexing thousands of documents, each of which contains hundreds of metadata fields; one field holds over 5,000 words and the rest hold 3-5 words each. I am using the edge n-gram tokenizer to power autosuggestion, and term_vectors (with positions and offsets) to enable fast vector highlighting. I have noticed that my index size has increased by over 50% since enabling these two features. Will this affect search performance as I index many more documents, and is it bad practice to keep such a large index?
Are there any disadvantages to using large indices?
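For reference, here is a minimal sketch of the kind of mapping I am using (the index name, field name, and gram sizes below are placeholders, not my real settings). The `term_vector: "with_positions_offsets"` setting is what enables the fast vector highlighter, and the `edge_ngram` tokenizer is what drives the autosuggestion:

```
PUT my_index
{
  "settings": {
    "analysis": {
      "tokenizer": {
        "autocomplete_tokenizer": {
          "type": "edge_ngram",
          "min_gram": 2,
          "max_gram": 10,
          "token_chars": ["letter", "digit"]
        }
      },
      "analyzer": {
        "autocomplete": {
          "type": "custom",
          "tokenizer": "autocomplete_tokenizer",
          "filter": ["lowercase"]
        }
      }
    }
  },
  "mappings": {
    "_doc": {
      "properties": {
        "body": {
          "type": "text",
          "analyzer": "autocomplete",
          "search_analyzer": "standard",
          "term_vector": "with_positions_offsets"
        }
      }
    }
  }
}
```

As I understand it, both features trade disk for speed: edge n-grams emit many extra tokens per word, and stored term vectors keep per-document position/offset data on disk, which would explain the size increase I am seeing.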
Elasticsearch version: 6.3.2