I am trying to push a 17 GB folder of assorted log files into Elasticsearch (ES) running on an 8 GB CentOS machine.
What is the best way to set things up for this?
I did the following (the resulting config is sketched below):

- Set ES_HEAP_SIZE=2g in /usr/share/elasticsearch/bin/elasticsearch
- Added index.codec: best_compression to elasticsearch.yml
- In /etc/sysconfig/elasticsearch: set MAX_OPEN_FILES=65535 and MAX_LOCKED_MEMORY=unlimited
- In /etc/security/limits.conf: set elasticsearch - nofile 65535 and elasticsearch - memlock unlimited
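
For reference, here is roughly how those settings look in each file (a sketch of my setup; the exact place to export ES_HEAP_SIZE can vary by ES version and install method):

    # /usr/share/elasticsearch/bin/elasticsearch (startup script)
    export ES_HEAP_SIZE=2g

    # elasticsearch.yml
    index.codec: best_compression

    # /etc/sysconfig/elasticsearch
    MAX_OPEN_FILES=65535
    MAX_LOCKED_MEMORY=unlimited

    # /etc/security/limits.conf
    elasticsearch - nofile 65535
    elasticsearch - memlock unlimited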
I am trying to index each entire log file as a single document; in that way I need to push all 17 GB of data into ES under one index.
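
For context, the pipeline is along these lines (a minimal sketch, not my exact config: the path and index name are hypothetical, and the multiline settings are one assumed way of merging a whole file into a single event):

    input {
      file {
        path => "/var/log/app/*.log"     # hypothetical path
        start_position => "beginning"
        codec => multiline {
          # A pattern that never matches plus negate => true makes every
          # line count as a match, and what => "previous" appends it to
          # the running event, so a whole file collapses into one document
          # (capped by max_lines / max_bytes, so events can get very large).
          pattern => "^__NEVER_MATCHES__$"
          negate => true
          what => "previous"
          max_lines => 100000
          auto_flush_interval => 5
        }
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "all-logs"              # hypothetical single index
      }
    }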
Observation:
After some time the ES cluster status goes red and data stops flowing into ES from Logstash (LS).
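
When it goes red, these are the checks I run (assuming ES is listening on localhost:9200):

    # Overall cluster health, including unassigned shard counts
    curl 'localhost:9200/_cluster/health?pretty'

    # Per-index status, document counts, and store size
    curl 'localhost:9200/_cat/indices?v'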
Kindly help me understand memory allocation in ES.
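
For example, this is the heap usage I would like to understand, pulled from the nodes stats API (again assuming localhost:9200):

    # JVM heap used/committed/max per node
    curl 'localhost:9200/_nodes/stats/jvm?pretty'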